Big potential in automating interior functions
Volker Entenmann, senior manager for UI functions at Mercedes-Benz in Sindelfingen, is responsible for vision-based interior sensing systems, including UX, hardware and software development. He will give the AutoSens audience in Detroit a presentation on “MBUX Interior Assistant – The first step towards an intelligent interior.” In this interview, Volker talks about his role at Mercedes-Benz, the biggest challenges for vehicle perception and the importance of an intelligent interior.
What was your experience prior to Daimler?
I studied aeronautics and started my professional career as a systems engineer for flight guidance systems at Daimler Benz Aerospace. I then joined Daimler passenger cars and held various positions in research, advanced engineering and series development. All of my work has been related to technologies that support user interaction in the car.
Can you summarize what the MBUX Interior Assistant is?
The MBUX Interior Assistant is an interior camera system available in the Mercedes-Benz GLE and CLA. It is our first step towards an intelligent interior. It reacts to the body language of the occupants and adapts seamlessly to what they are already doing. The MBUX Interior Assistant is the first system to provide global proximity detection for the touchscreen and the touchpad. It can reliably distinguish between driver and passenger and uses this capability to give each of them quick access to individual functions. The MBUX Interior Assistant also innovates light control: driver and passenger can switch the reading light on and off by moving a hand up and down below the inside rear-view mirror, and the system supports the driver with an automatic search light when reaching for an object on the passenger seat at night.
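To illustrate the driver/passenger distinction in principle: in a left-hand-drive car, a hand reaching toward the center console from the driver's side moves laterally in the opposite direction to one reaching from the passenger's side. The following toy heuristic sketches that idea; the function name, the cabin coordinate convention and the trajectory input are illustrative assumptions, not Mercedes-Benz's production algorithm, which relies on camera-based deep learning rather than a simple rule.

```python
# Toy sketch (hypothetical, not the production system): classify a
# tracked hand trajectory as belonging to the driver or the passenger
# from its net lateral motion toward the center console.

def classify_occupant(wrist_xs):
    """Return 'driver', 'passenger' or 'unknown' for a hand trajectory.

    wrist_xs: lateral wrist positions over time (meters, in an assumed
    cabin frame where x increases from the driver side to the
    passenger side of a left-hand-drive car).
    """
    if len(wrist_xs) < 2:
        return "unknown"
    dx = wrist_xs[-1] - wrist_xs[0]  # net lateral motion
    if dx > 0:
        # Hand moves toward the passenger side, so it entered
        # the console region from the driver side.
        return "driver"
    if dx < 0:
        return "passenger"
    return "unknown"

print(classify_occupant([-0.4, -0.2, -0.05]))  # driver's hand reaching right
print(classify_occupant([0.4, 0.2, 0.05]))     # passenger's hand reaching left
```

A real system must of course handle occlusion, ambiguous trajectories and right-hand-drive variants, which is one reason learned models are used instead of hand-written rules.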
Why is having an intelligent interior an important focus for your group?
The MBUX Interior Assistant establishes a new category of in-cabin user experience. We want to create magic moments for our customers by just letting things happen – based on the overall context and natural body movements, without requiring specific knowledge or training from the customer. While the whole industry is focusing on autonomous driving, we see an equally big potential in automating interior functions to increase comfort and safety.
What are the biggest challenges in your opinion for vehicle perception?
The features of the MBUX Interior Assistant require very powerful image processing. The necessary accuracy and robustness can only be achieved with deep learning algorithms, and implementing them on an affordable embedded platform in a thermally constrained environment is the biggest challenge. We set up our own software development team at Mercedes-Benz Research & Development India, and our experts succeeded in designing proprietary networks that run very efficiently and with low latency. Training those networks on unreleased car interiors under production conditions is another huge challenge, which we tackled by setting up our own AI development tool chain, including data collection and annotation.
What are you most looking forward to about presenting your work at AutoSens?
I am excited that in-cabin applications are now part of the AutoSens conference. I am particularly looking forward to sharing our vision of an intelligent interior with the experts of the industry and showing that it will perfectly complement the automation of the driving task.
If you’d like to hear Volker present at AutoSens in Detroit, purchase your ticket here >>