Affectiva today announced the close of a $26 million funding round to advance its emotion and object detection AI for monitoring vehicle passengers. The funding round was led by automotive supplier Aptiv. Affectiva wants its solution to be incorporated into cameras used in car safety systems to recognize when a driver is happy, sad, drowsy, or frustrated.
In the future, the company wants its detection systems to incorporate more context about how vehicle passengers interact with each other and with objects in the vehicle.
Through in-vehicle safety systems like the kind being developed by Nuance Communications, Affectiva’s computer vision system may recognize that a driver is drowsy, then use text-to-speech AI to ask the driver if they would like some music, to change the temperature, or to recommend that the car be pulled over.
Affectiva isn’t alone in its ambition to use computer vision and cameras to shape how vehicles respond to drivers, as companies like Seeing Machines and EyeSight, which raised $15 million last fall, attempt to do the same. Each wants to supply systems that satisfy government mandates for drowsiness and distraction detection systems like the kind currently being considered by the European Parliament.
Founded in 2009, Affectiva began by making emotional intelligence AI for video, then expanded to voice emotion detection in 2017.
Affectiva’s solutions for tracking emotion are currently used by a range of companies, including the makers of home robots and AI assistants, and in 2018 Affectiva introduced its solution for autonomous vehicles.
Affectiva will also use the funding to ensure its AI, trained on analysis of more than 7 million faces, avoids algorithmic bias. The company also wants its AI to detect more subtle or complex moods and behaviors.
“One of the things we’re working on is this concept of cognitive overload when you’re overly stimulated because a lot of things are happening to you,” Affectiva CEO Rana el Kaliouby told VentureBeat in a phone interview. “So we’re trying to build a model that can quantify how cognitively overloaded are you and then the car can again start to engage in ways that make it a safer experience.”
Such insights could lead to nudges from an in-vehicle AI assistant, or prompt an autonomous driving system to take control.
Another major focus for Affectiva going forward is deploying on the edge: embedding its models in vehicles, where compute power is limited.
“We are trying to run this on the vehicle. We don’t want any data sent to the cloud. So that means a lot of work shrinking these models and getting them to run on the automotive-grade chips,” el Kaliouby said.
Beyond car safety systems, Affectiva sees applications for ride-sharing companies. Computer vision could help with tasks like object detection to recognize that a passenger left a bag in a driver’s car, for example, or flag a poor user experience by reading a passenger’s face for signals like smiles or concern.
Affectiva also believes its mood-detection capabilities can be used to personalize experiences for drivers. For example, if you enter the vehicle looking stressed, it could respond by playing a favorite playlist or adapting the lighting. Affectiva was part of an experimental Kia concept at the Consumer Electronics Show in January that detected a passenger’s emotional state, then used things like lighting and even perfumed scents to affect their mood.
“You could imagine in the future, it would be more like a wellness pod,” el Kaliouby said.
Based in Boston and Cairo, Affectiva currently has 45 employees and has raised $53 million since its founding in 2009.