Driving ADAS technology and autonomous vehicles into the future with Siemens

Matthieu Worm, Director of Autonomous Vehicles at Siemens Digital Industries Software

We talk to Matthieu Worm, Director of Autonomous Vehicles within the Simulation and Test Solutions Group at Siemens Digital Industries Software, the lead sponsor of our AutoSens Detroit Edition. Matthieu shares how Siemens is adapting to the challenges of 2020, his views on validation, and what we can expect from Siemens at AutoSens this year.

I am sure that, like many of us, Siemens Digital Industries Software has spent much of 2020 learning new ways of working. Are you able to share any details?

Being a company that lives and breathes digitalization, we were already well equipped for remote working before the Corona crisis hit. What is most interesting from our perspective right now is the massively growing interest in digitalization throughout the industry. Ensuring that engineers have access to all the relevant, validated data in near real-time is absolutely critical when you cannot walk to your colleague’s desk to get the latest design or specification of a part. Online, model-based engineering environments and methodologies that have been the norm in software and core electronics engineering for years are now being adopted even more quickly throughout the entire supply chain. To help our customers in the transition towards digitalization through a model-based engineering approach, we have accelerated the digital marketing shift and immediately replaced physical events with an online offering. A good example of that is the Realize Live user conferences, recordings of which can be found here.

The bigger picture in automotive is still the promise of zero crashes through autonomous mobility. What do you see as the biggest hindrance to achieving this promise?

Driving ADAS technology and autonomous vehicles (AVs) into the future is a paramount challenge. It requires state-of-the-art components from multiple domains to push the boundaries of driving automation, and extensive testing, verification and validation to ensure the safety and comfort of these systems themselves. It also requires deployment into mobility systems, connected with other vehicles (V2V) and infrastructure (V2I), which is a prerequisite for higher levels of autonomous driving. With technology companies disrupting the automotive industry at unprecedented levels, traditional carmakers know they need to change to survive the present and win the future. These challenges come on top of ongoing business pressures: cost reduction, margin growth, quality issues and maintaining brand values. While keeping up with the competition and dealing with the risk of disruption requires big investments, there is a need to develop robust engineering methodologies and toolchains to ensure we can all feel safe with the arrival of more automation in driving.

We recently held a panel discussion on whether validation by shadow driving is a key enabler for L4 or a dangerous public test. What is your take on this question?

One important thing is the distinction between verification and validation. First, carmakers and suppliers need to verify whether their ADAS and AV systems comply with requirements under all possible circumstances within the defined Operational Design Domain. For this stage of the vehicle development process, shadow mode testing has no role. In the next step, validation of the systems is needed to see whether the car not only drives safely, but also behaves as you want it to behave, ensuring a comfortable ride for its passengers, for example. Shadow mode testing will be an element in this validation stage, with the major advantage of having a huge number of ‘test drivers’. The difficulty with shadow mode testing is its open-loop aspect. If the controller running in shadow mode suggests a different action than the controller operating the vehicle, you cannot get any insight into the effects of that action, since it won’t be executed. For that judgement, you need a virtual environment that can replay the observed scenario. This matches the approach we take with Simcenter for the autonomous vehicle domain: integrating data capturing, data processing and simulation into a single solution portfolio.
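To make the open-loop limitation concrete, below is a minimal sketch in Python of how shadow-mode divergences might be logged for later closed-loop replay. The controller and logger interfaces here are hypothetical illustrations, not Siemens APIs: the shadow command is computed but never executed, so its consequences can only be judged by replaying the flagged scenario in a virtual environment.

```python
# Minimal sketch of shadow-mode divergence logging (hypothetical interfaces).
# The shadow controller's command is computed but never executed; states where
# it diverges from the active controller are flagged for closed-loop replay
# in simulation, since their real-world outcome cannot be observed.

from dataclasses import dataclass


@dataclass
class Command:
    steering: float      # steering angle in rad
    acceleration: float  # longitudinal acceleration in m/s^2


def diverges(a: Command, b: Command,
             steer_tol: float = 0.05, accel_tol: float = 0.5) -> bool:
    """True if the two commands differ beyond simple tolerances."""
    return (abs(a.steering - b.steering) > steer_tol
            or abs(a.acceleration - b.acceleration) > accel_tol)


def run_shadow_mode(state_stream, production_controller, shadow_controller, logger):
    """Compare the active and shadow controllers on the same vehicle states.

    Only the production command drives the vehicle; divergent states are
    logged so the scenario can be re-simulated closed-loop offline.
    """
    for state in state_stream:
        executed = production_controller(state)   # this command drives the car
        suggested = shadow_controller(state)      # computed only, never executed
        if diverges(executed, suggested):
            # Open-loop limitation: we cannot observe the outcome of
            # 'suggested' here, so export the scenario for virtual replay.
            logger.flag_for_replay(state, executed, suggested)
        yield executed
```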

In your technical presentation at AutoSens you will discuss unknown and unsafe (critical) scenarios. What would be an example of such a scenario?

Rather than describing an example of an unknown unsafe scenario, I would like to highlight that the key is to increase test coverage beyond the limitations of the scenarios you have described as a result of safety analyses and regulation. By applying statistical and AI techniques to known scenario descriptions, we can generate thousands of scenario variants and identify the critical ones through closed-loop simulation. This makes a falsification approach possible, in which we actively search for scenarios that make the autonomous vehicle fail. We will describe this approach and the Simcenter Prescan360 solution for massive verification and validation at AutoSens.
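As a rough illustration of this falsification idea, the sketch below samples variants of a known cut-in scenario and keeps those whose minimum time-to-collision falls below a threshold. The scenario parameters and the `simulate` interface are assumptions made for illustration, not the actual Simcenter Prescan360 workflow.

```python
# Minimal sketch of a falsification loop over a parameterized cut-in scenario
# (hypothetical simulator interface, not the Simcenter Prescan360 API).
# We sample scenario variants, run each in closed-loop simulation, and keep
# those whose criticality metric (here, minimum time-to-collision) falls
# below a threshold, i.e. scenarios that make the AV stack fail.

import random


def sample_cut_in_variant(rng: random.Random) -> dict:
    """Draw one variant of a known cut-in scenario description."""
    return {
        "ego_speed_mps": rng.uniform(15.0, 35.0),      # ego vehicle speed
        "cut_in_gap_m": rng.uniform(5.0, 40.0),        # gap at cut-in start
        "cut_in_speed_mps": rng.uniform(10.0, 30.0),   # intruder speed
        "lateral_rate_mps": rng.uniform(0.5, 2.0),     # lane-change rate
    }


def falsify(simulate, n_variants: int = 5000,
            ttc_threshold_s: float = 1.0, seed: int = 0) -> list:
    """Search for scenario variants that drive minimum TTC below the threshold.

    `simulate(scenario) -> float` is assumed to run the AV stack closed-loop
    on the scenario and return the minimum time-to-collision observed.
    """
    rng = random.Random(seed)
    critical = []
    for _ in range(n_variants):
        scenario = sample_cut_in_variant(rng)
        min_ttc = simulate(scenario)
        if min_ttc < ttc_threshold_s:
            critical.append((min_ttc, scenario))
    # Most critical variants first, for inspection and regression testing.
    return sorted(critical, key=lambda item: item[0])
```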

You will talk about the creation of a digital twin. The digital twin has long since established itself in industry; how does this work in relation to automotive? Are you creating digital twins of both the city, i.e. the car’s surroundings, and the car itself? How much detail do you go into?

The Siemens Xcelerator digital twin solution portfolio is unique in that it stretches from chip to city. Using a comprehensive digital twin that provides a mirror image of the vehicle and its surroundings, starting from the core electronics and embedded software, through the vehicle and up to the mobility-system-level implementation, streamlines the development and validation process. The fidelity level of the simulation always depends on the use case and the object under test. For the development of a radar system, we enable the chip designers to emulate every detail of the chip design before it is turned into silicon, preventing the creation of a wafer that contains mistakes. For the perception system engineer who embeds this radar in a configuration with cameras and lidars, we generate raw sensor data through physics-based sensor models, without caring about the chip-internal dynamics. For system engineers who integrate the radar unit behind the bumper, we model high-frequency electromagnetic effects, allowing optimization of bumper designs, materials and paints to ensure a good radar image. Finally, the path planning teams require object-level information from faster-than-real-time radar models. For all these engineers we offer the right-fidelity modelling solution.

What are you hoping attendees will learn from Siemens at AutoSens Detroit Edition this year?

For me the best thing about AutoSens online events is the interaction that takes place during the presentations. I hope that people attending the different sessions feel free to share thoughts and pose questions to the other listeners, so that we can see new interactions and connections being established.
