Original article was published by Valentina Žeželj on Artificial Intelligence on Medium
Growing automotive AI companies & the many faces of AI in cars
Everything is about user experience these days and cars are no exception. Artificial intelligence is increasingly being used to help meet consumer expectations and build vehicles that are safer and more comfortable than ever.
At the moment, most new vehicles on the market already come with some AI-powered functionalities, while others are still being tested and developed. Let’s take a look at some of the most interesting automotive AI companies and exciting achievements in the field that have a direct impact on the development of autonomous vehicles.
Automotive AI companies that are making a difference
Autonomous driving is, without a doubt, the main goal of the automotive industry. However, AI can take on many roles inside the vehicle, improving safety and comfort in the cabin. Tier 1 vendors and OEMs are increasingly partnering with suppliers of IT software solutions and services to stay competitive on the market.
Veoneer — Driving assistance and automation
Autonomous driving is the prize in an intense race among automotive companies worldwide. Instead of sitting behind the wheel, future generations might get to enjoy the landscape or a digital screen while riding more safely and comfortably than ever.
However, before AI can fully replace human drivers, it needs to be gradually introduced. Taking over the driving step-by-step not only helps develop and thoroughly test new functionalities, but also improves driver-vehicle trust.
Creating trust in mobility is also the main guiding principle of Veoneer, the world’s largest pure-play company focused on Advanced Driving Assistance Systems (ADAS) and Autonomous Driving (AD).
Veoneer’s vision systems act as an additional pair of eyes in order to assist the driver. They detect and track objects in traffic such as traffic signs, other vehicles, pedestrians, etc. By doing so, the system can warn the driver or intervene when necessary. For example, if the system detects a potential collision, it can trigger emergency braking. ADAS features also include adaptive cruise control, lane keep assist, lane departure warning, traffic sign recognition, and more.
Veoneer also developed the world’s first publicly announced technology that meets the requirements for Level 4 autonomous driving. Their ADAS/AD ECU (Electronic Control Unit), named Zeus, acts as the ‘brain’ of the vehicle. It fuses data from cameras, radars and other sensors in order to interpret the traffic situation and take action, even in complex situations.
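How fusion like this works in detail is proprietary, but one textbook idea behind combining readings from several sensors is to weight each sensor’s estimate by its confidence. A minimal sketch of inverse-variance weighting for two range estimates (the sensor values and variances below are invented for illustration, not Veoneer’s actual design):

```python
def fuse_estimates(estimates):
    """Fuse independent (value, variance) range estimates of the same object.

    Each sensor reports a distance plus a noise variance; inverse-variance
    weighting trusts the more precise sensor more heavily.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * value for w, (value, _) in zip(weights, estimates)) / total
    fused_var = 1.0 / total  # combined estimate is more certain than either input
    return fused, fused_var

# Illustrative numbers: the camera is noisier at range than the radar.
camera = (42.0, 4.0)  # distance in metres, variance
radar = (40.0, 1.0)
distance, variance = fuse_estimates([camera, radar])
```

The fused distance lands closer to the radar’s reading because the radar reported a smaller variance; a production system would of course fuse many more signals (velocity, object class, free space) and handle sensor dropout.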
Although we have yet to enjoy complete driving automation on the road, AI-supported vehicles are already bringing safety and convenience to a whole new level.
Optalert — Drowsiness detection
Similarly to drugs and alcohol, drowsiness can significantly slow drivers’ reaction time, decrease awareness, and impair their judgment. Some people also experience microsleep — short, involuntary periods of sleep that can last up to 30 seconds. All of this makes drowsiness one of the most common causes of car crashes.
Detecting signs of drowsiness helps protect not only the driver and the passengers in the vehicle, but also other people on the road.
Optalert, a medtech company with more than two decades of research and development in the field of drowsiness detection, has developed a smart drowsiness and attentiveness algorithm that can be incorporated into driver monitoring systems.
The company uses blepharometry (the study of eyelid movements) and patented algorithms to quantify drowsiness. The algorithm monitors the driver’s face for common signs of drowsiness such as slow blinking, eye closing, yawning, and struggling to hold the head up. It then rates drowsiness on a 10-point scale, from 0 (‘very alert’) to 10 (‘very drowsy’). If drowsiness is detected, the system can trigger the appropriate safety measures.
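Optalert’s actual algorithm is patented and far more sophisticated, but the general idea of turning eyelid measurements into a 0–10 score can be sketched as follows (all thresholds and weights here are invented for illustration):

```python
def drowsiness_score(blink_durations_s, perclos):
    """Map eyelid metrics to an illustrative 0-10 drowsiness score.

    blink_durations_s: recent blink durations in seconds (slow blinks
                       are a common sign of drowsiness).
    perclos: fraction of the time window the eyes were mostly closed.
    """
    if not blink_durations_s:
        return 0.0
    mean_blink = sum(blink_durations_s) / len(blink_durations_s)
    # Alert blinks typically last ~0.1-0.3 s; treat longer blinks as slow.
    slow_component = max(0.0, min(1.0, (mean_blink - 0.3) / 0.7))
    closed_component = max(0.0, min(1.0, perclos / 0.3))
    score = 10.0 * (0.5 * slow_component + 0.5 * closed_component)
    return round(score, 1)
```

Brisk blinks and open eyes keep the score near 0 (‘very alert’), while long blinks and high eye-closure push it toward 10 (‘very drowsy’), at which point the system would trigger its safety measures.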
Being drowsy means you could be just a few moments away from sleep. Detecting drowsiness in its earliest stages can help protect drivers, even when they themselves don’t realize they may be at risk.
Eyesight — Driver monitoring
Driver monitoring systems (DMS) play a huge role in the transition from manual to semi-autonomous and fully autonomous vehicles. A reliable DMS supports ADAS by gathering valuable information about the driver.
Eyesight Technologies is one of the growing number of companies focused on driver monitoring. Their driver and occupant monitoring solutions help create safer and smarter in-cabin experiences for cars and fleets.
A DMS checks whether the driver is focused on driving by monitoring visual attributes such as head pose, blink rate, gaze vector, and more. Real-time monitoring helps detect whether the driver is drowsy, inattentive or distracted, and triggers safety measures when necessary.
For example, a driver’s head dropping down, closed or barely open eyes, and sudden head movements are common signs of drowsiness. Once the system detects them, it can trigger alerts and precautionary measures at the right time: it can warn the driver to pay attention to the road, suggest taking a break, slow the car down, and so on.
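The monitoring logic described above can be sketched as simple rules over a window of per-frame signals. This is a toy illustration, not Eyesight’s implementation; the signal names and thresholds are assumptions:

```python
from dataclasses import dataclass

@dataclass
class FrameSignals:
    head_pitch_deg: float  # negative = head dropping toward the chest
    eye_openness: float    # 0.0 closed .. 1.0 fully open
    gaze_on_road: bool

def assess_driver(frames, closed_eye_thresh=0.2, max_off_road=15):
    """Return an illustrative label for a window of DMS frames.

    Flags drowsiness when the eyes stay nearly closed or the head drops,
    and distraction when the gaze leaves the road for too many frames.
    """
    closed = sum(1 for f in frames if f.eye_openness < closed_eye_thresh)
    head_down = sum(1 for f in frames if f.head_pitch_deg < -30)
    off_road = sum(1 for f in frames if not f.gaze_on_road)

    if closed > len(frames) // 2 or head_down > len(frames) // 2:
        return "drowsy"      # e.g. suggest a break, trigger an audio alert
    if off_road > max_off_road:
        return "distracted"  # e.g. warn the driver to look back at the road
    return "attentive"
```

A real DMS would derive these signals from a neural face tracker running on camera frames and smooth them over time; the point here is only the shape of the decision logic.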
Eyesight’s occupancy monitoring system, on the other hand, keeps an eye on the entire in-cabin environment. It can detect how many passengers there are in the cabin, who they are, what their posture is like, etc. This helps adjust safety features to the current passengers. For example, airbag deployment can be adjusted according to passenger size and position, seat-belt alerts triggered when a belt is worn incorrectly, the in-car environment adjusted to the number and demographics of the passengers present, and more.
Despite significant advances in AI, human drivers still have the main responsibility for getting the vehicle to the destination safely. This, of course, comes with the risk of human error — the cause of most car accidents. Driver monitoring helps cars detect and respond to occupants’ needs and make the cabin safer and more comfortable for everyone in it.
Affectiva — Emotion AI
Affectiva, an MIT Media Lab spin-off, specializes in Emotion AI — emotion recognition technology that analyzes facial expressions and emotions. Affectiva’s SDKs and APIs can add emotion-sensing and analytics to applications and devices. Their technology is also used to help monitor what’s happening inside a vehicle.
By using cameras and microphones, Affectiva Automotive AI measures the emotional and cognitive state of the vehicle’s occupants from face and voice data. It measures facial expressions and emotions such as joy, anger and surprise, as well as vocal expressions of anger, arousal and laughter.
The technology can be used by OEMs and Tier 1 suppliers to increase safety, facilitate the handoff of controls and personalize the in-cabin experience.
Visage Technologies — In-cabin sensing
While most companies focus on developing a specific solution, Visage Technologies offers their automotive clients full flexibility. In other words, the company develops cutting-edge computer vision software which is then used to build specialized, custom solutions. Before I go into more detail, I want to add a quick disclaimer that I work at Visage Technologies.
Visage Technologies is focused on reading human faces with technology. In practice, this translates into smart face tracking, analysis and recognition technology used for driver monitoring and in-cabin sensing.
It enables reliable monitoring of the driver and occupants in real time — from their emotions and cognitive states, such as drowsiness and distraction, to occupancy, activity, and child detection. The purpose is to gather as much valuable data as possible in order to help address safety concerns in time, prevent accidents caused by human error and optimize the in-cabin experience.
More precisely, the face tracking and analysis technology developed by Visage Technologies tracks almost a hundred facial points in real time, detects gaze direction, estimates emotions, pinpoints occupants’ demographics, and more.
Knowing how attentive the driver is, where they are looking and how they are feeling helps detect (and respond to) potentially dangerous states such as fatigue, inattentiveness, distraction and road rage, preventing accidents caused by human error. For example, if the system detects anger, it can trigger safety measures such as playing relaxing music or slowing the car down. If the driver seems distracted, it can warn them to keep their eyes on the road or take a break.
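The state-to-response mapping described above amounts to a dispatch table: each detected state triggers a set of interventions. A minimal sketch (the state labels and action names are invented for illustration, not Visage Technologies’ API):

```python
# Hypothetical mapping from a detected driver state to interventions.
RESPONSES = {
    "drowsy": ["suggest_break", "audio_alert"],
    "distracted": ["eyes_on_road_warning"],
    "angry": ["play_relaxing_music", "limit_speed"],
}

def respond(state):
    """Return the list of interventions for a detected state (none if attentive)."""
    return RESPONSES.get(state, [])
```

Keeping detection and response decoupled like this lets an OEM tune the interventions per market or vehicle model without touching the perception layer.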
Besides helping make the ride safer, such technology can also be used to improve comfort for everyone in the cabin. It can, for example, be used to count the passengers, identify them, detect and track their faces, pinpoint their demographics, estimate their moods and states, and more. Such data can help adjust safety measures (such as mirrors, seat belt, etc.), personalize in-cabin settings such as lighting or heating, adjust entertainment content (for example, block content that is not family friendly if there is a child present), protect the most vulnerable passengers (for example, warn the parents about a child being left unattended in the vehicle), and more.
Finally, face recognition provides another layer of safety for the vehicle and its occupants. For example, only the verified persons can get access rights, which prevents car theft. The system can also remember someone’s in-cabin preferences, such as lighting, heating, music, seat position, etc., and automatically restore them when an authenticated person enters the car.
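The recognize-and-restore flow can be sketched as matching a face embedding against enrolled profiles and applying the stored preferences on a match. The embedding matcher, threshold and profile fields below are hypothetical stand-ins, not Visage Technologies’ actual interface:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(embedding, profiles, threshold=0.8):
    """Return (name, preferences) of the best-matching enrolled profile,
    or None for an unknown face (access denied, no preferences restored)."""
    best_name, best_sim, best_prefs = None, threshold, None
    for name, (ref_embedding, prefs) in profiles.items():
        sim = cosine_similarity(embedding, ref_embedding)
        if sim >= best_sim:
            best_name, best_sim, best_prefs = name, sim, prefs
    return (best_name, best_prefs) if best_name else None

# Hypothetical enrolled profile: reference embedding plus stored preferences.
profiles = {
    "alice": ([0.9, 0.1, 0.2], {"seat": "upright", "music": "jazz"}),
}
```

In practice the embeddings come from a trained face recognition network and are hundreds of dimensions long; the threshold trades off false accepts (a stranger starting the car) against false rejects (the owner being locked out).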
Visage Technologies provides their technology in the form of a comprehensive SDK. They also offer in-house custom development, giving clients who need a specialized solution complete flexibility.
The company’s strictly organic growth (it has been listed among the fastest-growing tech companies in EMEA since 2017) shows that a growing number of automotive companies recognize Visage Technologies as a reliable technology provider.
German Autolabs — Voice AI
Voice assistants such as Alexa and Google Assistant are quite common in our homes these days. We are getting used to telling machines what we need and expecting them to respond in the best way possible. Such technology is also proving quite valuable and desirable in cars.
German Autolabs built the world’s first in-car voice assistant and has kept innovating in the field ever since. The company specializes in automotive, AI-powered voice assistance for professional drivers.
Drivers, especially professional ones, often have to concentrate on several things at once. This can lead to stress and distractions, which can result in accidents. Intelligent, personalized voice assistants can help make fleets safer and more efficient.
German Autolabs’ AI assistant comes with advanced natural language capabilities and gesture recognition technology. Drivers get to enjoy safe and easy device interactions while keeping their hands on the wheel and eyes on the road. For example, they can safely operate apps, navigate, make calls and play music while driving.
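Under the hood, a voice assistant typically transcribes speech and then maps the utterance to an intent that is routed to a handler. A toy keyword-based router gives the flavor — real systems use trained natural language understanding models, and nothing here reflects German Autolabs’ implementation:

```python
def route_intent(utterance):
    """Map a transcribed utterance to a coarse intent, keyword-style."""
    text = utterance.lower()
    # Ordered rules: the first intent whose keyword appears wins.
    rules = [
        ("navigate", ["navigate", "directions", "route to"]),
        ("call", ["call", "dial", "phone"]),
        ("play_music", ["play", "music", "song"]),
    ]
    for intent, keywords in rules:
        if any(keyword in text for keyword in keywords):
            return intent
    return "unknown"  # hand off to a fallback dialog
```

Keyword matching like this breaks down quickly (ambiguity, synonyms, multi-step requests), which is exactly why production assistants rely on statistical language understanding instead.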
AI-powered voice assistants enable more comfortable and accessible human-device interaction. They give the driver more control with less time and effort, making driving safer.
On the road to automation
The market for artificial intelligence in the automotive industry is predicted to surpass $12 billion by 2026, proving that automotive AI is big business indeed. However, the road to fully automated driving is slow and winding. Before we can fully rely on AI, it needs to be able to handle challenging and often unpredictable real-life conditions.
By gradually introducing new AI features, car manufacturers can test and develop improved functionalities, while giving the consumers time to warm up to the idea of having AI in the co-pilot (and, ultimately, the driver’s) seat. Since AI has become one of the main differentiating factors on the market, we can expect many more exciting things to come.