An AutoSens Europe Pass is required to watch this video On-Demand
Bringing higher levels of automated driving to mass-production vehicles is one of the major missions in the automotive market. One of the related challenges for highly automated driving systems is last-mile autonomy, which includes complex urban traffic as well as indoor parking scenarios.
The sensors fitted to mass-production vehicles often compromise on resolution, accuracy, and field of view. Without expensive high-resolution lidar sensors and high-precision localization devices, on-board perception must still provide comprehensive environmental modelling. State-of-the-art machine learning approaches and grid-based algorithms, combined with standard automotive radars and cameras, can overcome these sensor limitations.
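To illustrate how grid-based algorithms let complementary sensors compensate for each other's weaknesses, here is a minimal occupancy-grid fusion sketch. The grid size, cell indices, and per-sensor occupancy probabilities are illustrative assumptions, not the actual system described in the talk; the key idea is that accumulating evidence in log-odds space lets a low-cost radar and a camera jointly produce a more confident environment model than either sensor alone.

```python
import numpy as np

# Hypothetical grid: 50x50 cells, log-odds 0.0 means "unknown" (p = 0.5).
GRID = (50, 50)
log_odds = np.zeros(GRID)

def inverse_sensor_update(log_odds, cells, p_occ):
    """Add log-odds evidence for the given cells.

    p_occ is the sensor's (assumed) belief that a detected cell is
    occupied; fusing in log-odds space makes evidence additive.
    """
    l = np.log(p_occ / (1.0 - p_occ))
    for r, c in cells:
        log_odds[r, c] += l
    return log_odds

def occupancy_prob(log_odds):
    """Convert fused log-odds back to occupancy probabilities."""
    return 1.0 - 1.0 / (1.0 + np.exp(log_odds))

# Radar: good range accuracy, noisy azimuth -> moderate confidence,
# smeared over two neighbouring cells.
log_odds = inverse_sensor_update(log_odds, [(10, 10), (10, 11)], p_occ=0.7)
# Camera: confirms one of the radar cells -> evidence accumulates there.
log_odds = inverse_sensor_update(log_odds, [(10, 10)], p_occ=0.8)

probs = occupancy_prob(log_odds)
# The doubly-observed cell is now more certain than the radar-only cell.
assert probs[10, 10] > probs[10, 11] > 0.5
```

The additive update assumes conditionally independent measurements, a standard simplification in occupancy-grid mapping; learned inverse sensor models can replace the fixed `p_occ` values per sensor.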
In addition, relevant objects under occlusion cannot be detected early enough to ensure safe control of the vehicle's movement. Furthermore, on-board perception cannot derive all relevant scene descriptions and interpretation data, including local traffic regimes and conventions such as restricted and shared zones, as well as reserved parking spaces.
To tackle these complex scenarios, the ADAS system needs enhanced environmental perception beyond on-board sensing alone. Additional data sources, including swarm perception from other vehicles and scene-interpretation data from road infrastructure, can be incorporated and aggregated together with on-board perception data. We will describe how to design a connected perception architecture that integrates these external data sources and matching machine learning algorithms to solve these challenges.
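The aggregation step above can be sketched as follows. The grid sizes, trust weights, and source names are illustrative assumptions rather than the talk's actual design; the sketch shows how an occluded object invisible to the ego vehicle can still appear in the fused environment model once trust-weighted evidence from other vehicles and infrastructure is combined with the on-board grid.

```python
import numpy as np

def aggregate(on_board, external_sources):
    """Fuse the on-board log-odds grid with trust-weighted external grids.

    external_sources: list of (log_odds_grid, trust) pairs, trust in [0, 1].
    Down-weighting by trust reflects latency, alignment error, and the
    unknown quality of a remote sensor (an assumed, simple policy).
    """
    fused = on_board.copy()
    for grid, trust in external_sources:
        fused += trust * grid  # log-odds add under an independence assumption
    return fused

on_board = np.zeros((4, 4))  # ego vehicle sees nothing at the occluded cell

# Another vehicle ("swarm perception") reports an object at cell (2, 2)...
swarm = np.zeros((4, 4)); swarm[2, 2] = 2.0
# ...and road infrastructure confirms it.
infra = np.zeros((4, 4)); infra[2, 2] = 1.5

fused = aggregate(on_board, [(swarm, 0.8), (infra, 0.9)])
prob = 1.0 - 1.0 / (1.0 + np.exp(fused[2, 2]))
assert prob > 0.9  # the occluded object surfaces via external sources
```

In a real connected-perception stack the external grids would first be time-compensated and transformed into the ego frame; that registration step is omitted here for brevity.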