Affectiva and Nuance to develop humanised automotive assistant

September 7, 2018

US company Affectiva plans to develop a joint automotive assistant which detects driver distraction and drowsiness and voices recommendations, such as navigating to a coffee shop. The solution is intended to align its dialogue with a motorist's emotional state, inferred from facial and verbal expressions.
The integrated solution will combine Affectiva Automotive AI with US-based Nuance Communications' Dragon Drive platform.
Affectiva Automotive AI measures facial and verbal expressions and emotions, such as anger and surprise, in real time. It also detects indicators of drowsiness, such as yawning, eye closure and blink rate, as well as physical or mental distraction.
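Affectiva has not published how these signals are combined. As a rough, hypothetical sketch only, indicators like blink rate, eye closure and yawning could be fused into a single drowsiness score; the class, field names, weights and thresholds below are illustrative assumptions, not part of Affectiva Automotive AI.

```python
from dataclasses import dataclass

@dataclass
class DriverFrameMetrics:
    # Hypothetical per-window signals of the kind the article describes.
    blink_rate_per_min: float   # blinks per minute
    eye_closure_ratio: float    # fraction of the window with eyes closed (0-1)
    yawning: bool               # yawn detected in the window

def drowsiness_score(m: DriverFrameMetrics) -> float:
    """Combine indicators into a 0-1 drowsiness score (illustrative weights only)."""
    score = 0.0
    score += 0.4 * min(m.eye_closure_ratio / 0.3, 1.0)    # prolonged eye closure
    score += 0.3 * min(m.blink_rate_per_min / 30.0, 1.0)  # elevated blink rate
    score += 0.3 * (1.0 if m.yawning else 0.0)            # yawning
    return min(score, 1.0)

# Example: a drowsy-looking sample exceeds a warning threshold of 0.6.
sample = DriverFrameMetrics(blink_rate_per_min=28, eye_closure_ratio=0.25, yawning=True)
print(drowsiness_score(sample) > 0.6)  # True
```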
Through the partnership, Dragon Drive will enable the in-car assistant to interact with passengers based on emotion and cognitive state detection. It currently facilitates this interaction through gesture, touch, gaze detection and voice recognition powered by natural language understanding.
Stefan Ortmanns, executive vice president and general manager of Nuance Automotive, says these additional modes of interaction will help the company's OEM partners develop automotive assistants that can ensure the safety and efficiency of connected and autonomous cars.
In the future, the automotive assistant may also be able to take control of semi-autonomous vehicles if the driver displays signs of physical or mental distraction.
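The article does not describe how such a handover would be decided. A minimal, hypothetical sketch of an escalation policy, assuming drowsiness and distraction scores between 0 and 1 and illustrative thresholds, might look like this:

```python
def assistant_action(drowsiness: float, distraction: float) -> str:
    """Hypothetical escalation policy: recommend, warn, then take over (illustrative thresholds)."""
    if drowsiness > 0.9 or distraction > 0.9:
        return "engage_semi_autonomous_control"  # hand control to the vehicle
    if drowsiness > 0.6:
        return "suggest_coffee_stop"             # e.g. navigate to a coffee shop
    if distraction > 0.6:
        return "voice_attention_warning"
    return "no_action"

print(assistant_action(drowsiness=0.7, distraction=0.2))  # suggest_coffee_stop
```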