Sensor Innovations in ADAS are Saving Lives on the Road

Safety on the roads is a massive problem – every year, more than 1.1 million people are killed in road traffic crashes[1], and an estimated 20 to 50 million more suffer non-fatal injuries.
 
A major cause of these crashes is driver error[2]. Carmakers and government regulatory agencies are always looking for ways to improve safety, and in recent years there have been big advances in how Advanced Driver Assistance Systems (ADAS) can help to reduce deaths and injuries on the roads.
 
In this article, we’ll look at the role of ADAS in improving safety, and the various sensor technologies that play a vital part in making this possible.

The Evolution and Importance of ADAS

Since the first anti-lock braking systems (ABS) were introduced in the 1970s, there has been a steady increase in the deployment of ADAS technologies in passenger vehicles, with corresponding improvements in safety. In the U.S. alone, the National Safety Council (NSC) estimates that ADAS has the potential to prevent about 62% of total traffic fatalities, saving more than 20,000 lives each year[3]. ADAS features like automatic emergency braking (AEB) and forward collision warning (FCW) have become commonplace in recent years, available on more than a quarter of vehicles to help drivers prevent accidents and ultimately save lives.

ADAS requires multiple technologies working in conjunction. A sensing suite acts as the “eyes” of the system, detecting the vehicle’s surroundings and feeding data to the “brains” of the system, which computes actuation decisions to assist the driver – for example, AEB automatically applies the brakes when a vehicle ahead is detected and the driver has not stepped on the brakes themselves, stopping the vehicle in time to avoid a rear-end collision. The sensing suite is built around a vision system: an automotive-grade camera whose heart is a high-performance image sensor capturing a video stream of the car’s surroundings. This stream can be used to detect vehicles, pedestrians, traffic signs, and more, and can be displayed to help the driver in low-speed maneuvering and parking situations. The camera is often paired with a depth-sensing modality such as radar, LiDAR, or an ultrasonic sensor, which augments the camera’s two-dimensional images with distance measurements to objects, adding redundancy and removing ambiguity.
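
To make that flow concrete, here is a minimal Python sketch of an AEB-style decision. Everything in it – the function, the thresholds, the sensor values – is a hypothetical illustration for this article, not an onsemi or production API.

```python
# Illustrative AEB-style decision: a camera detects a vehicle ahead, a depth
# sensor supplies its range, and the system brakes only if the driver hasn't.
# All names, values, and thresholds are hypothetical, for explanation only.

def aeb_decision(camera_detects_vehicle: bool,
                 range_m: float,
                 ego_speed_mps: float,
                 driver_braking: bool,
                 reaction_margin_s: float = 1.0) -> bool:
    """Return True if automatic emergency braking should engage."""
    if not camera_detects_vehicle or driver_braking:
        return False
    # Time-to-collision: how long until we reach the detected vehicle.
    time_to_collision_s = range_m / max(ego_speed_mps, 0.1)
    return time_to_collision_s < reaction_margin_s

# Vehicle ahead at 12 m, ego speed 15 m/s (~54 km/h), driver not braking:
# TTC = 0.8 s < 1.0 s margin, so AEB engages.
print(aeb_decision(True, range_m=12.0, ego_speed_mps=15.0, driver_braking=False))
```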

Implementing ADAS is a challenging problem for automotive manufacturers and their Tier-1 system suppliers. The processing power available to handle all the data generated by multiple sensors is limited, and the sensors themselves have performance limitations. Automotive industry requirements dictate that every component – not just the hardware but also the associated software algorithms – must be highly reliable, so extensive testing is needed to ensure safety. The systems must also perform consistently in the most difficult lighting and weather conditions, cope with extreme temperatures, and operate reliably for the lifetime of the vehicle.

Key Sensor Technologies in ADAS

Let’s now look in more detail at some of the key sensor technologies used in ADAS, including image sensors, LiDAR, and ultrasonic sensors. Each of these sensor modalities provides a different type of data, and the software algorithms that receive them combine the streams to generate an accurate and comprehensive understanding of the environment. This process is called sensor fusion: by exploiting the redundancy of multiple sensor modalities, it improves the accuracy and reliability of the perception algorithms, driving higher-confidence decisions and enabling higher levels of safety. The complexity of these multi-sensor suites can escalate quickly, with algorithms requiring ever more processing power. However, the sensors themselves are also becoming more advanced, allowing some processing to be done locally at the sensor level instead of on the central ADAS processor.
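
As a toy illustration of how redundancy across modalities builds confidence, the sketch below confirms a hypothetical camera detection only when a depth return agrees with it; real fusion algorithms are far more sophisticated, and all the detections, ranges, and tolerances here are invented for illustration.

```python
# Toy sensor fusion: confirm a camera detection only if a depth sensor
# (e.g. radar or LiDAR) reports a return at a consistent range.
# All detections, ranges, and the gating tolerance are hypothetical.

from dataclasses import dataclass

@dataclass
class CameraDetection:
    label: str                 # e.g. "pedestrian", "vehicle"
    estimated_range_m: float   # coarse range inferred from the image

def fuse(camera: CameraDetection,
         depth_returns_m: list[float],
         tolerance_m: float = 2.0) -> bool:
    """A camera detection is confirmed when a depth return corroborates it."""
    return any(abs(r - camera.estimated_range_m) <= tolerance_m
               for r in depth_returns_m)

det = CameraDetection(label="pedestrian", estimated_range_m=18.0)
print(fuse(det, depth_returns_m=[17.2, 45.6]))  # True: corroborated at ~17 m
```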

Automotive Image Sensors

Image sensors are the “eyes” of the vehicle – arguably the most important sensor type in any vehicle outfitted with ADAS. They provide image data that can be used to enable a wide range of ADAS features, from “machine vision” driver assistance features like automatic emergency braking, forward collision warning, and lane departure warning, to “human viewing” 360-degree surround view cameras for parking assistance and camera monitor systems for electronic mirrors, to driver monitoring systems that can detect a distracted or drowsy driver and sound an alarm to prevent accidents.

onsemi offers a wide range of image sensors, including its Hyperlux family, which delivers excellent image quality with low power consumption. The Hyperlux sensor pixel architecture includes an innovative super-exposure imaging scheme, which can capture high dynamic range (HDR) frames with LED flicker mitigation (LFM), overcoming issues of misinterpretation caused by pulsed LED front and rear vehicle lighting or LED traffic signs.

Hyperlux image sensors are designed to perform in challenging automotive scene conditions, such as deep shadow beneath an overpass against direct sunlight above it, by capturing up to 150 decibels (dB) of dynamic range. Cameras with Hyperlux image sensors can handle corner cases with a performance level much better than the human eye, operating well at light levels far below 1 lux.
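
For a sense of scale, dynamic range in decibels maps to a signal ratio via 10^(dB/20), so 150 dB corresponds to roughly a 31-million-to-one span between the brightest and darkest signals resolvable in a single frame:

```python
# Dynamic range in dB corresponds to a brightest-to-darkest signal ratio
# of 10**(dB / 20); 150 dB spans roughly a 31-million-to-one ratio.
dynamic_range_db = 150
ratio = 10 ** (dynamic_range_db / 20)
print(f"{ratio:.2e}")  # ~3.16e+07, i.e. about 31,600,000:1
```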

onsemi’s Hyperlux image sensors include the 8-megapixel AR0823AT and the 3-megapixel AR0341AT. These are digital CMOS image sensors which use the Hyperlux 2.1 μm super-exposure single photodiode pixel technology to deliver excellent low-light performance, while also capturing a wide dynamic range in scenes with low and high illumination in the same frame. The super-exposure pixel enables enough dynamic range in one frame to allow a “set-it-and-forget-it” exposure scheme, effectively eliminating the need for auto-exposure adjustment as lighting conditions change, such as driving out of a tunnel or parking garage on a sunny day.

Depth Sensors (LiDAR)

Measuring precisely how far away objects are from the sensor is known as depth sensing. Depth information removes ambiguity from the scene and is essential both for various ADAS functionalities and for enabling higher levels of automation, up to fully autonomous driving.

There are multiple technologies that can be used for depth sensing. Light Detection and Ranging (LiDAR) is the best choice when depth performance is the priority. LiDAR enables depth sensing with high depth and angular resolution, and because it actively illuminates the scene with a near-infrared (NIR) laser paired with a sensor, it can operate in all ambient light conditions. It is suitable for both short-range and long-range applications. While lower-cost radar sensors are more prevalent in automotive applications today, they lack the angular resolution of LiDAR and cannot provide the high-resolution three-dimensional point cloud of the surroundings needed for levels of autonomy beyond basic ADAS.

The most common type of LiDAR architecture is direct time-of-flight (ToF), which involves sending out a short pulse of infrared light and measuring the time taken for the signal to reflect from the object and return to the sensor, enabling the distance to be calculated directly. The LiDAR sensor replicates this measurement across its field of view by scanning the light over the entire scene.
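
The arithmetic behind direct ToF is simple: the measured round-trip time converts to distance via the speed of light, halved because the pulse travels out and back. A short sketch with a hypothetical return time:

```python
# Direct time-of-flight: distance = (speed of light * round-trip time) / 2.
# The pulse travels to the target and back, hence the division by two.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    return C * round_trip_s / 2

# A return pulse arriving ~667 ns after emission implies a target ~100 m away.
print(f"{tof_distance_m(667e-9):.1f} m")  # ~100.0 m
```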

onsemi’s ARRAYRDM-0112A20 silicon photomultiplier (SiPM) array is a single-photon-sensitive sensor with 12 channels in a monolithic array, offering high photon detection efficiency (PDE) at NIR wavelengths such as 905 nm for the detection of returned pulses. This SiPM array has been integrated into a LiDAR[4] fitted to one of the first passenger vehicles in the world to offer drivers the convenience of true “eyes-off” autonomous driving. This capability goes beyond basic driver assistance into what is often referred to as self-driving, where the driver can effectively stop paying attention to the road – something that has yet to be reliably rolled out in consumer vehicles without LiDAR depth sensing.

Ultrasonic Sensors

Another technology used for distance measurement is ultrasonic sensing, where a transducer emits a sound wave at frequencies beyond the range of human hearing, and then detects the sound that bounces back, thus enabling distance measurement from time-of-flight.

Ultrasonic sensors are used for close-range obstacle detection and low-speed maneuvering applications such as parking assistance. One advantage of ultrasonic sensors is that sound travels much more slowly than light, so the time for a reflected sound wave to return to the sensor is typically measured in milliseconds, as opposed to nanoseconds for light – which means ultrasonic sensors require much lower-performance processing, hence reducing system costs.
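
A quick calculation makes the timing contrast concrete. The sketch below assumes sound travels at about 343 m/s in air at room temperature and compares echo times for an object one metre away:

```python
# Ultrasonic time-of-flight uses the same out-and-back formula as LiDAR,
# but with the speed of sound (~343 m/s in air at ~20 degrees C, assumed here).
SPEED_OF_SOUND = 343.0          # m/s
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def echo_time_s(distance_m: float, wave_speed_mps: float) -> float:
    return 2 * distance_m / wave_speed_mps  # out and back

d = 1.0  # an object one metre away
print(f"sound: {echo_time_s(d, SPEED_OF_SOUND) * 1e3:.2f} ms")  # ~5.83 ms
print(f"light: {echo_time_s(d, SPEED_OF_LIGHT) * 1e9:.2f} ns")  # ~6.67 ns
```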

An example of an ultrasonic sensor is onsemi’s NCV75215 parking distance measurement ASSP. During vehicle parking, this part operates with a piezoelectric ultrasonic transducer to provide time-of-flight measurement of an obstacle’s distance. It detects objects at distances from 0.25 m up to 4.5 m, and provides high-sensitivity, low-noise operation.

Conclusion

onsemi has played a major role in the development of the sensor technologies required for ADAS. onsemi invented the dual conversion gain pixel technology and HDR operation now used in many sensors across the industry, and pioneered the innovative super-exposure design that enables a sensor to deliver both excellent low-light performance and the ability to capture HDR scenes without saturation from a single photodiode. Thanks to this market and technology leadership, the majority of ADAS image sensors on the road today were developed by onsemi[5].

These innovations have enabled onsemi to provide high-performance sensors for automotive applications for two decades, in turn allowing ADAS to make a dramatic impact on improving vehicle safety.

The automotive industry is continuing to invest heavily in ADAS and pursue the goal of full autonomy for vehicles – moving beyond basic features that assist drivers, defined by SAE as Levels 1 and 2, to genuine self-driving capabilities, defined by SAE as Levels 3, 4 and 5[6]. Reducing road deaths and injuries is one of the major motivations behind this trend, and onsemi’s sensor technologies will play a vital role in this transformation in automotive safety.

Interested in this technology?

Learn more about onsemi’s sensors and their capabilities in the ADAS system solution guide.
