Advanced Driver Assistance Systems (ADAS) are redefining vehicle perception. We chat with Miquel Testar, Director of Product Management for ADAS sensors at Bosch. With over 16 years of experience, including 12 years focused on ADAS, Miquel shares insights into the current state of ADAS sensor technology and the exciting future of automotive safety and automation.
Miquel, tell us about your journey. How did you land at Bosch in the area of ADAS?
- First of all, I am originally from the Catalonia region and was therefore really excited to be back home in Barcelona to discuss one of my passions: automotive sensing.
- I have an academic background in electrical engineering and economics, which helps me bring together market and technology knowledge in my current role as Director of Product Management for ADAS sensors at Bosch.
- I have 16 years of experience in the field of sensors, 12 of them specifically in ADAS, gathered in both technical and commercial roles, mostly in radar but also in lidar, camera and ultrasonics. I have worked both from Germany, focusing on ADAS for established OEMs, and from California for L4 players.
We see a broad spectrum of capabilities within lidar and imaging radar, and many imaging radar products (large aperture, dense array) claim lidar-like performance. Will they ever get there?
- In raw performance (angular separability for static targets), it is going to be challenging for radar to catch up with lidar.
- The goal for radar technology is to improve its performance at the functional level and increase its contribution to the critical use cases where it struggles today (e.g. lost cargo, a pedestrian next to a metal structure). At Bosch we optimize for performance on the road in critical use cases rather than just on paper in the form of KPI sheets. Most importantly, we do this from the very beginning, as early as the specification of the radio-frequency parameters of the radar chip.
- We are also seeing that deep learning-based radar processing can deliver a huge performance delta, as we saw with video some years ago.
- Other technologies like distributed coherent radar and broadband digital radar can bring an additional performance delta that will further close the gap.
- Another great thing about radar is that it has complementary failure modes to optical-based solutions (like camera or lidar).
With different models and different ODDs needing different levels of redundancy and different mixes of sensor modalities, how should an OEM build a perception stack that scales, and avoid reinventing the wheel for every ADAS application?
The two key elements would be:
- Maximize data reuse across trim levels. Ideally, have a single data-acquisition path based on a superset defined by the trim level with the most advanced ADAS feature set.
- Optimize cost for the high-volume trim level. This means the other trims will not be individually cost-optimized, but that is the price to pay in the trade-off.
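The superset idea above can be sketched as a simple configuration check. All names here (`SENSOR_SUPERSET`, the trim labels, `validate_trims`) are hypothetical illustrations, not Bosch's actual configuration:

```python
# Hypothetical sensor superset, defined by the top trim with the most
# advanced ADAS feature set. Lower trims select subsets of it so the
# same data-acquisition path (and recorded data) can serve every trim.
SENSOR_SUPERSET = {
    "front_radar", "corner_radar_fl", "corner_radar_fr",
    "front_camera", "lidar",
}

# Example trim levels as subsets of the superset (illustrative only).
TRIM_SENSORS = {
    "base":    {"front_radar", "front_camera"},
    "mid":     {"front_radar", "front_camera",
                "corner_radar_fl", "corner_radar_fr"},
    "premium": SENSOR_SUPERSET,
}

def validate_trims(superset: set, trims: dict) -> bool:
    """Check that every trim uses only sensors from the superset, so
    data collected on the top trim can be reused for all trims."""
    return all(sensors <= superset for sensors in trims.values())
```

A check like this makes the trade-off explicit: lower trims are constrained to the shared acquisition path rather than individually cost-optimized.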
When we see imaging radar and lidar used together, what will the impact be of selecting early sensor fusion or late sensor fusion – is there a clear best approach here?
- Early fusion can enable a performance improvement for a given set of sensors, or alternatively enable a reduction of raw sensor performance requirements while keeping overall performance constant.
- This needs to be combined with late fusion, though, to increase safety and make the system more robust.
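To make the early/late distinction concrete, here is a minimal sketch, assuming toy 2D detections with a confidence score; the `Detection` class, the gating distance, and the confidence-combination rule are all illustrative assumptions, not a production fusion design:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    x: float           # longitudinal position (m)
    y: float           # lateral position (m)
    confidence: float  # 0..1

def early_fuse(radar_points, lidar_points):
    """Early (low-level) fusion: merge raw detection points from both
    sensors before object formation, so weak evidence in one modality
    can be confirmed by the other."""
    return radar_points + lidar_points

def late_fuse(radar_objects, lidar_objects, gate=1.0):
    """Late (object-level) fusion: each sensor forms objects
    independently; objects are associated by proximity and their
    confidences combined. Independent pipelines provide redundancy."""
    fused, used = [], set()
    for r in radar_objects:
        match = None
        for i, l in enumerate(lidar_objects):
            if i not in used and abs(r.x - l.x) < gate and abs(r.y - l.y) < gate:
                match = i
                break
        if match is not None:
            used.add(match)
            l = lidar_objects[match]
            # Treat confidences as independent: P(at least one is right).
            conf = 1 - (1 - r.confidence) * (1 - l.confidence)
            fused.append(Detection((r.x + l.x) / 2, (r.y + l.y) / 2, conf))
        else:
            fused.append(r)  # radar-only object survives (redundancy)
    fused.extend(l for i, l in enumerate(lidar_objects) if i not in used)
    return fused
```

The point of the sketch: early fusion pools raw evidence for performance, while the object-level path keeps each sensor's pipeline independent, which is what makes the combined system more robust when one modality fails.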
When one of the modalities in the sensor fusion degrades, how do we monitor that and account for that?
- There are internal sensor algorithms that measure whether the sensor is currently blocked, misaligned, or otherwise degraded. All this information is then delivered to the perception stack, which properly weights the confidence of each sensor's detection points in the overall perception result.
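The mechanism described above can be sketched as a health score that down-weights detections before fusion. The diagnostic inputs, thresholds, and function names here are hypothetical illustrations of the idea, not a real sensor interface:

```python
def health_factor(blockage: float, misalignment_deg: float,
                  max_misalignment_deg: float = 2.0) -> float:
    """Map self-diagnostic readings to a 0..1 weight.

    blockage: estimated fraction of the aperture that is blocked (0..1).
    misalignment_deg: estimated boresight error; beyond the assumed
    tolerance (default 2 degrees) the sensor contributes nothing.
    """
    blockage_term = max(0.0, 1.0 - blockage)
    align_term = max(0.0, 1.0 - misalignment_deg / max_misalignment_deg)
    return blockage_term * align_term

def weight_detections(detections, blockage, misalignment_deg):
    """Scale each (x, y, confidence) detection by the sensor's current
    health factor before handing it to the fusion/perception stage, so a
    degraded sensor is trusted less without being switched off."""
    w = health_factor(blockage, misalignment_deg)
    return [(x, y, conf * w) for (x, y, conf) in detections]
```

A healthy sensor passes its detections through unchanged; a half-blocked, slightly misaligned one still contributes, just with reduced confidence, which is the graceful degradation the answer describes.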