By Junko Yoshida, Editor-in-Chief, The Ojo-Yoshida Report, reporting from AutoSens in Brussels 2021
Among all the advancements in vehicles with more automated features, sensors stand out: they provide the foundation that enables vehicles to see, and comprehend, the surrounding world. The accuracy of sensors can make or break advanced driver-assistance systems (ADAS) and autonomous vehicles (AVs). Given their critical importance, AutoSens conferences continue to focus on the robustness of sensors.
Many traditional suppliers have refrained from highlighting any glaring weaknesses in their sensor modality, but that era of laissez-faire has passed.
As ADAS features proliferate in new models, carmakers face pressure to clarify what their automated functions can and cannot do. Similar pressure should be applied to regulators and testing agencies. Today, the National Highway Traffic Safety Administration (NHTSA) and EuroNCAP, for example, only evaluate vehicle safety in normal conditions.
If the automotive industry defines ADAS, and really means it, as a system to “assist drivers for better road safety,” it dares not turn a blind eye to the inconvenient truth that today’s ADAS vehicles (and AVs) are vulnerable when driving in the dark, into the sun, in fog, or in heavy rain or snow. Just as human drivers have a hard time driving in bad weather, so do machines.
People routinely drive after sunset or in heavy rain. Bad weather or night-time driving should not be “corner cases” for vehicles with automated features.
The vulnerability of sensors has been well known since the fall of 2019, when AAA released its research report on Automatic Emergency Braking (AEB) with Pedestrian Detection. It was shocking to see the AAA footage in which AEB-P-equipped ADAS vehicles kept mowing down crash-test dummies, one after another, during a closed-course trial.
The tests proved that the AEB-P function in four reputable vehicles (all equipped with a combination of vision sensors and radar) did not work as designed. Every system failed to detect pedestrians some of the time. Further, AAA only released its daytime test footage, because none of the cars stopped even once during nighttime tests.
Software-driven sensors
The automotive industry has seen myriad new solutions designed to improve the sensing capabilities of current-generation sensors.
Just as the idea of the “software-defined vehicle” has driven system-level requirements and architecture, the notion of “software-defined sensors” is prompting sensor companies to look not just at their hardware but also at the software that runs on top of it and, ultimately, at the sensor fusion stack.
Algolux is one of the most visible purveyors of the software-driven approach. It currently offers two different software tools: Atlas and Eos.
The Atlas Camera Optimization Suite by Algolux can “optimize the accuracy of any existing cameras up to 20 to 25 percent,” Dave Tokic, vice president of marketing and business development, told The Ojo-Yoshida Report. The operative point is that the tools can be applied to “any existing camera.”
Algolux claims that its Atlas uses “the industry’s first set of machine learning tools and workflows that automatically optimizes camera architectures for optimal image quality or for computer vision.” The upshot is that customers can get improved computer vision results “in days” rather than months, according to Tokic.
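Algolux has not published Atlas’s internals, but the broad idea of automatically tuning camera/ISP settings against a computer-vision metric can be sketched as a black-box search. In the sketch below, the parameter names, ranges, scoring stub, and random-search strategy are all illustrative assumptions, not the actual Atlas workflow.

```python
# Illustrative sketch only: automatic ISP parameter tuning driven by a
# computer-vision metric. Parameter names, ranges, and the scoring stub are
# hypothetical; Algolux's Atlas workflow is not public.
import random

# Hypothetical ISP parameters and their allowed ranges.
PARAM_RANGES = {
    "denoise_strength": (0.0, 1.0),
    "sharpening": (0.0, 2.0),
    "gamma": (1.8, 2.6),
    "tone_map_contrast": (0.5, 1.5),
}

def detection_score(params: dict) -> float:
    """Stand-in for running the ISP with these settings on a validation set
    and measuring detector accuracy (e.g., mAP). Here it is just a toy score."""
    return -sum((v - (lo + hi) / 2) ** 2
                for (lo, hi), v in zip(PARAM_RANGES.values(), params.values()))

def random_search(n_trials: int = 200, seed: int = 0) -> tuple[dict, float]:
    """Black-box search over ISP settings, keeping the best-scoring set."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        candidate = {k: rng.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}
        score = detection_score(candidate)
        if score > best_score:
            best_params, best_score = candidate, score
    return best_params, best_score

if __name__ == "__main__":
    params, score = random_search()
    print("best ISP settings:", params, "score:", round(score, 4))
```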
By contrast, Algolux’s Eos bypasses image signal processing and focuses on the embedded perception stack. “We’ve demonstrated up to three times improvement,” noted Tokic, “in vision system robustness in all conditions.” The objective for Eos is to overcome the shortcomings of computer vision in harsh lighting and poor weather. Eos’s claim to fame is an efficient end-to-end deep learning architecture applicable to any camera lens/sensor configuration or to full multi-sensor fusion, according to the company.
Stradvision is another company offering software tools that sensor companies can use to improve their products’ sensing capability.
With a tool called SVNet, Stradvision offers software solutions that improve a vehicle’s perception, enabling it to better detect and recognize objects. SVNet, according to Stradvision, runs deep learning-based object detection software on automotive ECUs, with minimum computational requirements and power consumption. SVNet has been ported onto TI TDA2x and Renesas’ V3H, V3M, and H3.
New sensor modalities
Beyond the use of standard cameras and radar, new sensor modalities are getting a lot of attention. Flir, for example, is pitching its thermal cameras to make ADAS and AV systems more efficient.
In parallel, Flir is offering its thermal dataset free to ADAS and AV algorithm developers. The company explained that it provides, for reference, a set of more than 14,000 annotated thermal images of summer driving, captured during day and night, with their corresponding RGB imagery. The dataset covers five annotation classes: people, dogs, cars, bicycles, and other vehicles.
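For developers, working with such a dataset largely means parsing its annotations. The sketch below assumes a COCO-style JSON annotation file and a hypothetical file name; consult Flir’s documentation for the dataset’s actual layout.

```python
# Illustrative sketch only: tallying object classes in an annotated thermal
# dataset. The file path and COCO-style JSON layout are assumptions; the
# actual format of Flir's free ADAS dataset may differ.
import json
from collections import Counter

def count_annotations(annotation_file: str) -> Counter:
    """Count labeled objects per category in a COCO-style annotation file."""
    with open(annotation_file, "r", encoding="utf-8") as f:
        coco = json.load(f)
    id_to_name = {c["id"]: c["name"] for c in coco["categories"]}
    return Counter(id_to_name[a["category_id"]] for a in coco["annotations"])

if __name__ == "__main__":
    # Hypothetical path; substitute the real annotation file from the dataset.
    for name, n in count_annotations("thermal_annotations.json").most_common():
        print(f"{name}: {n}")
```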
Meanwhile, TriEye is promoting a CMOS-based Short-Wave Infrared (SWIR) sensing technology to automakers. The claim is that it makes imaging possible in “all weather and lighting conditions.” Unlike current cost-prohibitive SWIR solutions, TriEye says its patented structure and CMOS manufacturing process can dramatically reduce the expense of a SWIR sensor.
New LiDARs
Suppliers of individual sensors are also improving their products’ object detection and perception through heavy use of software.
Aeye, for example, touts a so-called “bistatic lidar architecture,” with separate transmission and reception channels. The company has explained that this separation enables Aeye’s iDAR to deliver more resolution where it is needed.
One known issue with lidar sensors is performance degradation in rain. If a lidar beam intersects a raindrop close to the transmitter, the droplet can reflect enough of the beam back to the receiver to register as an object. Droplets can also absorb some of the emitted light, degrading the sensor’s effective range.
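That failure mode can be pictured with a simple point-cloud filter: returns that are both very close to the sensor and unusually weak are more likely to be rain clutter than solid objects. The thresholds and synthetic data below are illustrative assumptions, not any vendor’s algorithm.

```python
# Illustrative sketch only: flagging likely rain clutter in a lidar point
# cloud by combining a near-range check with a low-intensity check. The
# thresholds and synthetic data are assumptions, not any vendor's algorithm.
import numpy as np

def filter_rain_clutter(points: np.ndarray,
                        intensity: np.ndarray,
                        near_range_m: float = 2.0,
                        min_intensity: float = 0.05) -> np.ndarray:
    """Return a boolean mask that keeps returns unlikely to be raindrops.

    points:    (N, 3) array of x, y, z in meters, sensor at the origin.
    intensity: (N,) array of normalized return intensities in [0, 1].
    """
    ranges = np.linalg.norm(points, axis=1)
    likely_rain = (ranges < near_range_m) & (intensity < min_intensity)
    return ~likely_rain

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(-50, 50, size=(1000, 3))    # synthetic returns
    inten = rng.uniform(0.0, 1.0, size=1000)      # synthetic intensities
    mask = filter_rain_clutter(pts, inten)
    print(f"kept {mask.sum()} of {mask.size} returns")
```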
To combat this, Aeye’s customers can create a library of deterministic, software-configurable scan patterns at design time, each addressing a specific use case, according to the company. In addition to creating different patterns for highway, urban, and suburban driving, or for an exit ramp, Aeye explained that customers can define patterns optimized for bad weather in those environments, such as a “highway rain” scan pattern versus a “highway sunlight” scan pattern.
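Conceptually, such a design-time library amounts to a table of named, deterministic patterns keyed by driving environment and weather. The data structure and parameter values below are illustrative assumptions, not Aeye’s actual configuration format.

```python
# Illustrative sketch only: a design-time library of deterministic scan
# patterns selected by driving environment and weather. Field names and
# values are assumptions, not Aeye's actual configuration format.
from dataclasses import dataclass

@dataclass(frozen=True)
class ScanPattern:
    name: str
    horizontal_fov_deg: float   # angular width to scan
    vertical_fov_deg: float
    frame_rate_hz: float
    dense_roi_deg: float        # narrow region scanned at higher resolution

SCAN_LIBRARY = {
    ("highway", "rain"):     ScanPattern("highway_rain", 60.0, 20.0, 20.0, 10.0),
    ("highway", "sunlight"): ScanPattern("highway_sunlight", 60.0, 20.0, 30.0, 15.0),
    ("urban", "clear"):      ScanPattern("urban_clear", 120.0, 30.0, 20.0, 30.0),
    ("exit_ramp", "clear"):  ScanPattern("exit_ramp_clear", 90.0, 25.0, 25.0, 20.0),
}

def select_pattern(environment: str, weather: str) -> ScanPattern:
    """Pick a predefined pattern; fall back to a conservative urban default."""
    return SCAN_LIBRARY.get((environment, weather), SCAN_LIBRARY[("urban", "clear")])

if __name__ == "__main__":
    print(select_pattern("highway", "rain"))
```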
Another lidar company, Opsys, is building on the strengths of its technology, based on high-performance SPAD and VCSEL lidar, to offer superior resolution and scanning rate. The key for Opsys, or any other new-generation sensor technology, is whether its sensors can meet customer demands for flexibility and scalability.
With its “micro-flash lidar,” Opsys has developed customizable solutions for OEMs and tier ones as they work on different types of vehicles, such as L3 highway-pilot vehicles or L4 autonomous trucks. Each type of vehicle presents a different field of view (FoV). Because Opsys’ micro-flash lidar bundles multiple base sensors into a single system, it can offer an integrated, single 4D point cloud with a flexible FoV, the company explained.
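Bundling several base sensors into one logical sensor can be pictured as transforming each unit’s returns into a common vehicle frame and concatenating them, so that adding units widens the combined FoV. The mounting poses and synthetic data below are illustrative assumptions, not Opsys’s implementation.

```python
# Illustrative sketch only: merging point clouds from several base lidar
# units into one combined cloud in the vehicle frame. Mounting poses and
# synthetic data are assumptions, not Opsys's implementation.
import numpy as np

def to_vehicle_frame(points: np.ndarray, rotation: np.ndarray,
                     translation: np.ndarray) -> np.ndarray:
    """Apply a rigid transform (sensor frame -> vehicle frame) to (N, 3) points."""
    return points @ rotation.T + translation

def fuse_clouds(clouds: list[np.ndarray],
                extrinsics: list[tuple[np.ndarray, np.ndarray]]) -> np.ndarray:
    """Concatenate per-sensor clouds after moving each into the vehicle frame."""
    return np.vstack([to_vehicle_frame(c, R, t) for c, (R, t) in zip(clouds, extrinsics)])

if __name__ == "__main__":
    rng = np.random.default_rng(1)

    def yaw(deg: float) -> np.ndarray:
        """Rotation about the vertical axis by the given angle in degrees."""
        a = np.radians(deg)
        return np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a),  np.cos(a), 0.0],
                         [0.0,        0.0,       1.0]])

    # Two hypothetical units, yawed +/- 30 degrees, mounted 0.5 m apart.
    clouds = [rng.uniform(0, 30, size=(500, 3)), rng.uniform(0, 30, size=(500, 3))]
    extrinsics = [(yaw(30.0), np.array([0.0, 0.25, 0.0])),
                  (yaw(-30.0), np.array([0.0, -0.25, 0.0]))]
    merged = fuse_clouds(clouds, extrinsics)
    print("combined cloud:", merged.shape)
```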
It’s becoming increasingly clear that robustness in an ADAS or AV sensing system won’t come from improving a single sensor. If the industry’s goal is indeed to make next-generation ADAS and AVs stand up to “all conditions,” it takes a village.
While a host of remarkable hardware advancements continue to emerge in lidar, radar, and vision, the safety of vehicles with automated features still needs vastly improved performance in perception and sensor fusion. For that, software is the key.