
Machine Learning Leveraging In-Car Cameras & Sensors for Situational Awareness

Updated: Mar 16



According to the WHO, ~1.35 million people die each year in road traffic accidents. With rising concerns over road safety, innovators around the world are building technology-driven systems that enhance safety for drivers and riders. 


Be it airbags and seat belts, anti-lock braking systems (ABS), or electronic stability control (ESC), technology has long been at the forefront of vehicle safety. The advent of advanced driver assistance systems (ADAS) in recent years, however, has ushered in a new wave of transformation in the automotive industry. ADAS leverages modern sensors, cameras, radar, LiDAR, and AI to enable vehicles to observe, analyze, and respond to their surroundings in real time. 

Cameras & Sensors: Functionality and Capabilities

  • Forward-facing (front) cameras are mounted on the front windshield or grille. They capture visuals of the road ahead and detect road signs, lane markings, pedestrians, obstacles, and other vehicles in the path, enabling features such as lane departure warning (LDW), lane-keeping assist (LKA), traffic sign recognition, and automatic emergency braking (AEB).

  • Rear-view cameras are usually positioned at the rear of the vehicle, near the license plate. They capture images and video while reversing, helping drivers avoid collisions with obstacles or people, even those in the vehicle's blind spots.

  • LiDAR (Light Detection and Ranging) sensors use laser pulses to estimate distances to surrounding objects and build accurate 3D maps of the environment. They can detect and classify objects such as vehicles, pedestrians, cyclists, and road signs, and they provide the precise depth and spatial perception needed to navigate complex environments safely.

  • Radar sensors emit radio waves and measure the reflections from objects or people to estimate their distance, direction of movement, and speed. Radar works reliably even in adverse conditions such as darkness, rain, smog, or fog, and can detect objects and people at long range, aiding collision avoidance and adaptive cruise control.

  • Ultrasonic sensors use sound waves to detect objects in the vehicle's immediate proximity. Their effective range of a few meters makes them well suited to parking assistance and obstacle detection at low speeds.
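The ranging sensors above (ultrasonic, LiDAR) share one underlying principle: time-of-flight. A pulse travels to an object and back, so the one-way distance is half the propagation speed times the round-trip time. Here is a minimal sketch of that calculation; the function name and the example echo times are illustrative assumptions, not values from any specific sensor.

```python
# Time-of-flight ranging: distance = (propagation speed x round-trip time) / 2.

SPEED_OF_SOUND_M_S = 343.0          # in air at ~20 C (ultrasonic)
SPEED_OF_LIGHT_M_S = 299_792_458.0  # in vacuum (LiDAR)

def tof_distance(round_trip_s: float, speed_m_s: float) -> float:
    """Return the one-way distance in meters from a round-trip echo time."""
    return speed_m_s * round_trip_s / 2.0

# An ultrasonic echo returning after ~11.66 ms corresponds to an object ~2 m away
print(round(tof_distance(0.01166, SPEED_OF_SOUND_M_S), 2))  # 2.0

# A LiDAR pulse returning after 200 ns corresponds to an object ~30 m away
print(round(tof_distance(200e-9, SPEED_OF_LIGHT_M_S), 1))  # 30.0
```

The enormous difference in propagation speeds is why ultrasonic sensors suit short-range, low-speed use while LiDAR can map objects tens of meters away.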


Innovations in Technology

As the descriptions above make clear, vehicles now have multiple means of gathering input about their surroundings - other vehicles, pedestrians, obstacles, road signs, and more. Machine learning (ML) consumes this data to analyze the environment inside and outside the vehicle, processing it in real time to identify patterns, threats, and danger. A typical ML pipeline involves the following steps: 

  1. Preprocessing: Irrelevant data, noise, and anomalies are removed, and data formats are standardized for subsequent processing. This ensures consistency in the assessment.

  2. Feature extraction: Once preprocessed, the data undergoes transformations to extract its relevant characteristics - object size, shape, speed, distance, trajectory, etc.

  3. Assessment: The extracted features are compared against training data - pre-labeled datasets where each input is already paired with a label. Based on this correlation, the system can classify objects (vehicles, pedestrians, cyclists, road signs) and/or identify patterns that reveal a trend or deviation (lane departures, rash driving, or a likely collision).
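The three steps above can be sketched end to end in a few lines. This is a toy illustration, not a production pipeline: the detection records, feature names, and class centroids are all made up, and the "assessment" step is a simple nearest-centroid rule standing in for a trained classifier.

```python
# Toy ML pipeline: preprocessing -> feature extraction -> assessment.
import math

def preprocess(detections):
    """Step 1: drop anomalous readings (e.g. negative distances)."""
    return [d for d in detections if d["distance_m"] > 0 and d["speed_kmh"] >= 0]

def extract_features(d):
    """Step 2: reduce a detection to a feature vector (size, speed)."""
    return (d["size_m"], d["speed_kmh"])

# Step 3: class centroids that would be learned from a pre-labeled
# training set (values here are invented for illustration).
CENTROIDS = {"pedestrian": (0.5, 5.0), "cyclist": (1.7, 20.0), "vehicle": (4.5, 60.0)}

def classify(features):
    """Assign the label whose centroid is nearest in feature space."""
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

raw = [
    {"size_m": 0.6, "speed_kmh": 4.0, "distance_m": 12.0},
    {"size_m": 4.2, "speed_kmh": 55.0, "distance_m": 40.0},
    {"size_m": 1.8, "speed_kmh": 18.0, "distance_m": -1.0},  # sensor glitch, filtered out
]
labels = [classify(extract_features(d)) for d in preprocess(raw)]
print(labels)  # ['pedestrian', 'vehicle']
```

A real ADAS stack would replace the centroid rule with deep networks trained on camera, radar, and LiDAR data, but the pipeline structure - clean, featurize, assess - is the same.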


Here are some examples of how this impacts road safety:

  • Rear-end collisions: 

    • AEB can detect impending collisions and apply the brakes to prevent or mitigate the impact. The Insurance Institute for Highway Safety (IIHS) estimates that AEB alone can prevent ~28% of rear-end collisions and reduce the severity of injuries in many others.

  • Lane departure accidents: 

    • LDW and LKA alert drivers when they drift out of their lane unintentionally and can even steer the vehicle back to safety. According to the National Highway Traffic Safety Administration (NHTSA), LDW systems can prevent ~7,000 fatal and 50,000 injury crashes every year in the United States.

  • Pedestrian accidents: 

    • ADAS can detect pedestrians in the vehicle's path and automatically apply the brakes to avoid an accident. A survey conducted by AAA concluded that drivers using ADAS features such as adaptive cruise control, LKA, driver monitoring systems (DMS), and AEB feel more confident while driving.
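The braking decisions described above are often framed in terms of time-to-collision (TTC): the gap to the object divided by the closing speed. The sketch below illustrates that logic; the 1.5 s threshold and the function names are assumptions for illustration, not a real system's calibration.

```python
# Toy AEB decision logic based on time-to-collision (TTC).

def time_to_collision(gap_m: float, closing_speed_m_s: float) -> float:
    """Seconds until impact if neither party changes speed."""
    if closing_speed_m_s <= 0:       # not closing in: no collision course
        return float("inf")
    return gap_m / closing_speed_m_s

def should_brake(gap_m: float, closing_speed_m_s: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    """Trigger emergency braking when TTC falls below the threshold."""
    return time_to_collision(gap_m, closing_speed_m_s) < ttc_threshold_s

print(should_brake(30.0, 10.0))  # TTC = 3.0 s -> False
print(should_brake(12.0, 10.0))  # TTC = 1.2 s -> True
```

Production systems refine this with driver reaction-time models, road-condition estimates, and staged responses (warning, pre-charge, full braking), but TTC remains the core quantity.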


As is evident, ML/AI, computer vision, and modern sensors have proven pivotal in giving vehicles unprecedented situational awareness. Drivers today receive timely alerts about potential dangers on the road. From collision avoidance to driver monitoring, these technologies are identifying risks and safeguarding drivers and pedestrians alike. 

The University of Michigan Transportation Research Institute (UMTRI) reported that while ADAS can improve safety, it will succeed only if we educate the drivers and train them on using these systems effectively. 

The global ADAS market is booming, spurred by rising demand, innovations in sensor technology, and advancements in AI. A report by Grand View Research expects the global ADAS market to reach ~$83 billion by 2028, growing at a compound annual growth rate (CAGR) of over 12%. But automotive safety will thrive only if we continue to innovate, collaborate, and invest in new technologies - be it the democratization of connected vehicles, the spread of vehicle-to-vehicle communication, or the embrace of fully autonomous driving systems. 


However, this will not be easy. There are significant challenges in ensuring reliability, guaranteeing cybersecurity, and meeting regulatory compliance, as well as in navigating the ethical and legal implications toward a global consensus. The hope is that the projected reductions in accidents, injuries, and fatalities on the roads will spur us on.
