
Context Adaptation for Automotive Sensor Fusion

Jan Aelterman
Assistant Professor, Ghent University

Released on October 05, 2023

Sensor fusion is key to environment perception in challenging conditions such as snow, hail, nighttime, and lens flares. Today, individual sensor processing relies heavily on machine learning, requiring algorithm developers to collect or simulate large numbers of challenging training samples. Unfortunately, this leads to a combinatorially growing need for training data to cover every combination of challenging conditions (e.g. snow + hail + night + lens flare). We propose "context-adaptive" fusion as a solution: a probabilistic approach in which an "interpretation layer" translates the output statistics of an existing sensor algorithm into statistics tuned to a particular challenging context such as fog, snow, hail, nighttime, lens flares, or even distance. The advantage is that this approach adapts to the challenging context without modifying existing sensor algorithms, using only a very small number of training samples. It is a natural fit for sensor fusion architectures in which edge AI receives a low-data-rate input describing these contexts, an arrangement known as "cooperative" sensor fusion. This talk will demonstrate how a "cooperative" fusion architecture outperforms a standard sensor fusion pipeline in detection accuracy and tracking performance by adapting to different contexts through the proposed "interpretation layers".
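To make the idea concrete, the sketch below illustrates one way such an interpretation layer could work: an existing detector's raw confidence score is remapped through a small per-context calibration before fusion, so the detector itself is left untouched. This is a minimal illustration assuming a logistic recalibration and naive independent-evidence fusion; the function names, calibration values, and fusion rule are hypothetical and are not taken from the talk.

```python
import math

# Hypothetical per-context calibration parameters (slope, intercept) for a
# logistic remapping of a detector's raw confidence. In practice these would
# be fitted from a small number of labelled samples per context.
CONTEXT_CALIBRATION = {
    "clear": (1.0, 0.0),   # identity-like: trust the detector as-is
    "fog":   (0.6, -0.8),  # detector tends to be overconfident in fog
    "night": (0.7, -0.5),
}

def recalibrate(raw_confidence: float, context: str) -> float:
    """Translate a raw detector confidence into a context-tuned probability."""
    slope, intercept = CONTEXT_CALIBRATION.get(context, (1.0, 0.0))
    eps = 1e-6
    # Map to logit space, apply the context-specific adjustment, map back.
    logit = math.log((raw_confidence + eps) / (1.0 - raw_confidence + eps))
    return 1.0 / (1.0 + math.exp(-(slope * logit + intercept)))

def fuse(p_camera: float, p_lidar: float) -> float:
    """Naive fusion of two per-sensor probabilities as independent evidence."""
    odds = (p_camera / (1.0 - p_camera)) * (p_lidar / (1.0 - p_lidar))
    return odds / (1.0 + odds)

# The same raw scores lead to different fused beliefs once the low-data-rate
# context input ("fog") is taken into account.
print(fuse(recalibrate(0.9, "clear"), recalibrate(0.7, "clear")))
print(fuse(recalibrate(0.9, "fog"),   recalibrate(0.7, "fog")))
```

Because only the small calibration table depends on the context, adding a new challenging condition requires fitting a handful of parameters rather than retraining or recombining the underlying sensor algorithms.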
