
The Real-World Value of Cooperative Sensor and Multi-Exposure Fusion in Automotive Perception

Event: AutoSens Brussels
| Published: September 2022

Hear from:

Jan Aelterman
Assistant Professor,

Ghent University

Sensor fusion is key to robust environment perception. Unfortunately, the throughput requirements of “data-level” fusion are prohibitive, a problem exacerbated by higher-fidelity sensors, e.g. 16+ bit HDR vs. 10-bit video. Practical architectures instead rely on “late” fusion: each sensor processes its data into low-throughput semantic data before fusion. This limits the potential accuracy improvement. Instead, imec/Ghent University proposes “cooperative” fusion, introduced at AutoSens 2019 by Prof. Philips. This retains the simplicity of late fusion but increases robustness and accuracy: sensors improve their decisions using well-chosen feedback from other sensors. Unfortunately, increased-fidelity sensors such as HDR cameras drive up memory and computational requirements. We demonstrate that this problem can be avoided using a content- and picture-quality-preserving HDR-to-SDR conversion.
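To make the bit-depth argument concrete, the sketch below shows a toy global tone-mapping HDR-to-SDR conversion: a 16-bit frame is normalized, gamma-compressed, and requantized to 8 bits, halving the per-pixel memory footprint. This is only an illustrative stand-in; the talk's actual content- and picture-quality-preserving conversion method is not specified here, and the function name and gamma value are assumptions.

```python
import numpy as np

def hdr_to_sdr(hdr, bit_depth_in=16, bit_depth_out=8, gamma=2.2):
    """Toy global tone mapping: normalize, gamma-compress, requantize.

    Illustrative only; not the method from the talk. Gamma compression
    allocates more output codes to dark regions, roughly matching
    perceptual sensitivity.
    """
    max_in = 2 ** bit_depth_in - 1
    max_out = 2 ** bit_depth_out - 1
    normalized = hdr.astype(np.float64) / max_in      # map to [0, 1]
    compressed = normalized ** (1.0 / gamma)          # gamma compression
    return np.round(compressed * max_out).astype(np.uint8)

# A 16-bit HDR frame becomes an 8-bit SDR frame of the same shape,
# so downstream per-sensor processing handles half the data per pixel.
hdr_frame = np.random.randint(0, 2**16, size=(4, 4), dtype=np.uint16)
sdr_frame = hdr_to_sdr(hdr_frame)
```

In practice, real conversions use local (content-adaptive) operators rather than a single global curve, precisely to preserve detail that a downstream detector needs.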
This talk further covers the real-world benefits of cooperative fusion, using radar/lidar/HDR-camera traffic data acquired in Belgium. These benefits are realized on several key performance indicators: vulnerable road user (VRU) detection/tracking accuracy and stability, and processing-induced latency (track-initialization delay).
