On Demand

From next year, Euro NCAP will, for the first time, assess direct driver state monitoring systems as part of its 5-star rating programme.
As In-Cabin technologies evolve rapidly, Euro NCAP sees their potential to open new angles for improving the life-saving performance of ADAS.
In this presentation, Euro NCAP will take a deep dive into the requirements included in the 2023 protocol, alongside an outlook for In-Cabin monitoring assessment as part of its 2030 Roadmap.

Hear from:

Adriano Palao
Technical Manager
ADAS & AD
European New Car
Assessment Programme


For multiple reasons, mainly end-user driven but also ecosystem and differentiation driven, focusing on safety for in-cabin sensing is necessary, but not sufficient. This presentation will cover the importance of enabling user experience in in-cabin sensing, on top of regulatory needs.

Hear from:

Adrian Capata
Senior Vice President
Engineering
DTS/XPERI


In a technology-dominated world there is an increasing need to bridge the gap between man and machine. Making our interactions with technology more seamless and intuitive will enable us to lead smarter, more productive, healthier and happier lives. Human Insight AI, which understands, supports and predicts human behavior in complex environments, can help us achieve this ambitious goal.
In this keynote Dr. Rana el Kaliouby will explore how human-centric AI delivers better automotive safety and more engaging mobility experiences that enhance comfort, wellness and entertainment. Drawing from her decades of work on Emotion AI, she will discuss the future of the in-cabin experience and the imperative to realize this in an ethical manner.

Hear from:

Dr. Rana el Kaliouby
Deputy CEO
Smart Eye
Founder
Affectiva


What functionality is critical in terms of delivering safety and delivering great mobility experiences? What use-cases do we expect to be viable? How diverse are the requests for functionality and what will the added value be?

Hear from:

Egoi Sanchez
System Architect | Interior
Volvo Cars

Josef Stockinger
Sr. Technical Marketing
Manager, ADAS
STMicroelectronics

Verena Ihring
Senior Product Manager
Interior Monitoring Systems
Robert Bosch

Moderated by
Junko Yoshida
Editor-in-Chief
The Ojo-Yoshida Report


A family of IPs able to precisely predict, in real time, the sleep onset of a subject through the analysis of a few selected physiological parameters has been developed over the years, designed to run on both contact (wearable) and contactless (RADAR) systems.
First results from a Reduced Wakefulness Maintenance Test confirmed the prediction capability of the IP; more intensive validation tests are planned soon at the Dallara F1 dynamic vehicle simulator.
In this talk, we will present the key concepts and the preliminary results.

Hear from:

Riccardo Groppo
CTO and Founder
Sleep Advice Technologies


To overcome the limitations of contact sensors, computational psychophysiology based on thermal infrared (IR) imaging has been assessed as a solution for the quantitative evaluation of several parameters associated with autonomic nervous system (ANS) activity.
The ability to assess human factors with a non-contact technology makes thermal IR imaging well suited to in-cabin monitoring. Recently, we developed machine learning-based models to classify driver stress, drowsiness, thermal comfort, and cognitive workload.
In this talk, we will show the latest advancements in the field.

Hear from:

Arcangelo Merla
Professor, Biomedical Engineering
G. d’Annunzio University
of Chieti-Pescara

Founder
Next2U


This talk will explore the progress of 3D Time-of-Flight and 60 GHz radar as complementary technologies addressing in-cabin use cases and acting as key enablers for differentiating features. See examples of secure 3D face authentication added to standard DMS functionality, or a highly optimized and cost-efficient seat occupant detection solution including child presence detection (CPD) and intrusion alert.

Hear from:

Martin Lass
Senior Product Marketing Manager Time-of-Flight
Infineon


By using higher frequencies (140 GHz) and digital modulation, radar performance can be brought to new levels and smaller form factors. Beyond better radars, imec is taking the lead in event-based sensor fusion between passive and active sensing modalities, paving the way to highly energy-efficient and fast perception for improved safety. In this talk, we will present opportunities and challenges related to these new radar and sensor fusion technologies.
Presented by imec

Hear from:

Dr. Li Huang
Senior Business Development Manager
imec


In multiple-input multiple-output (MIMO) mode, FMCW radars can estimate the distance, angle of arrival (AoA) and velocity of multiple targets simultaneously, which is critical for in-cabin applications. This presentation describes a sensor that performs three safety functions using the compact Infineon BGT60ATR24C radar chip.
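
For readers who want to see how those three quantities are typically extracted, the sketch below is a simplified, hypothetical NumPy example; the chirp parameters, two-antenna array and FFT chain are illustrative assumptions, not the sensor presented in this talk. Range and velocity come from two successive FFTs over the data cube, and AoA from the phase difference between receive channels.

# Minimal sketch of MIMO FMCW processing: range, velocity and angle of arrival
# from a radar data cube. Parameters are illustrative, not the BGT60ATR24C's.
import numpy as np

c = 3e8                      # speed of light (m/s)
f0 = 60e9                    # carrier frequency (Hz)
B = 5.5e9                    # chirp bandwidth (Hz)
Tc = 100e-6                  # chirp repetition time (s)
n_chirps, n_samples, n_rx = 64, 128, 2

# cube[chirp, sample, rx]: baseband IF samples (random placeholder data here)
cube = np.random.randn(n_chirps, n_samples, n_rx)

# 1) Range FFT along fast time, 2) Doppler FFT along slow time
rng_fft = np.fft.fft(cube * np.hanning(n_samples)[None, :, None], axis=1)
rd = np.fft.fftshift(np.fft.fft(rng_fft, axis=0), axes=0)  # range-Doppler per RX

# Strongest cell in the non-coherently summed range-Doppler map
power = np.abs(rd).sum(axis=2)
d_bin, r_bin = np.unravel_index(np.argmax(power), power.shape)

# Bin indices -> physical quantities
range_m = r_bin * c / (2 * B)                              # range resolution c/2B
vel_ms = (d_bin - n_chirps / 2) * c / (2 * f0 * n_chirps * Tc)

# AoA from the phase difference across the two RX antennas (lambda/2 spacing assumed)
phase_diff = np.angle(rd[d_bin, r_bin, 1] * np.conj(rd[d_bin, r_bin, 0]))
aoa_deg = np.degrees(np.arcsin(phase_diff / np.pi))

print(f"range {range_m:.2f} m, velocity {vel_ms:.2f} m/s, AoA {aoa_deg:.1f} deg")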

Hear from:

Milan Stojanovic
Radar Algorithms and
Software Engineer
Novelic


The presentation will describe a multi-level, modular approach for validating DMS that enables more adaptive and affordable processes and that complies with regulation. We will discuss the testing methodology as well as how to find metrics that help the automotive industry focus on supporting the desired use cases, including concrete examples and suggested metrics. We will also explain how this approach can help the automotive industry become more agile in structuring the DMS architecture for future generations.

Hear from:

Clementine Francois
Engineering Manager
Tobii


Interior monitoring cameras are gathering growing interest as a way to bring better safety along with additional convenience features. While driver monitoring is the baseline option, interior monitoring is expected to bring additional customer value. LGE has developed core perception technologies and their applications for interior monitoring solutions such as human body pose, seat belt and left-object detection. During development, we identified many technical challenges in bringing these features to mature performance for end customers, including database collection, model training and application. In this presentation, we briefly introduce the LGE interior monitoring solution and discuss some typical challenges in real-life scenarios.

Hear from:

Youngkyung Park
Director of Automotive Vision System Development
LG Electronics

Jungyong Lee
Machine Learning Engineer
LG Electronics


If done correctly, drowsiness can be quantified from its early stage through to the late stages. There is a biologically based change that occurs in the transition from alertness to sleep in everyone. This drowsiness indicator has been proven independently in multiple sleep studies.

In this presentation we will discuss the recent convergence of technology and the changes in the regulatory and safety landscape that now make it possible to measure drowsiness in passenger vehicles.

We will describe the importance of the right “Ground Truth” in the development of any drowsiness detection system.

The benefits of objective measurements of drowsiness extend beyond a person’s state of mind at a single point in time. Recent advancements in cameras, image processing, facial feature extraction, and Advanced Driver Assistance Systems (ADAS) have made it possible to interact with the driver and their environment directly. In this presentation, we will also describe a multi-stage approach to the application of countermeasures. This preserves driver comfort while prolonging the duration of safe driving, by introducing tiered levels of countermeasure intensity best suited to remedy the various stages of driver drowsiness.
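
To make the tiered idea concrete, the following is a deliberately simplified sketch; the thresholds, tier count and actions are hypothetical and do not describe Optalert's scale or product behavior.

# Hypothetical sketch: mapping a normalized drowsiness score in [0, 1] to a
# tiered countermeasure. Thresholds and actions are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Countermeasure:
    tier: int
    action: str

# Lower bound of each tier; stronger interventions at higher scores.
TIERS = [
    (0.30, Countermeasure(1, "subtle cabin change: ventilation, lighting")),
    (0.50, Countermeasure(2, "gentle audio prompt suggesting a break")),
    (0.70, Countermeasure(3, "persistent audio and haptic warning")),
    (0.85, Countermeasure(4, "escalate to ADAS: reduce speed, guide to a safe stop")),
]

def select_countermeasure(score: float) -> Optional[Countermeasure]:
    """Return None while the driver is alert, else the highest tier reached."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("drowsiness score expected in [0, 1]")
    selected = None
    for lower_bound, cm in TIERS:
        if score >= lower_bound:
            selected = cm
    return selected

print(select_countermeasure(0.2))   # None: no intervention while alert
print(select_countermeasure(0.75))  # tier 3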

Hear from:

Dr Trefor Morgan
GM R&D
Optalert


Details tbc

Hear from:

Dr. Philippe Dreuw
Chief Product Manager Interior Monitoring Platform
Robert Bosch


Activity recognition is a crucial part of In-Cabin Sensing. It not only helps detect whether an occupant is inattentive; it also helps build a full understanding of the scene inside the car. This enables a variety of safety, comfort, wellness and entertainment functions.

Robust activity recognition in a car requires the fusion of different low-level algorithms that each have their unique strengths. This talk will give insight into the steps involved in developing an activity recognition system, from data collection and annotation to integration on a System on Chip. Mona will talk about the challenges of activity recognition, such as how to define the temporal boundaries of an activity, and will show examples from activity recognition modules that she has developed and plans to work on in the future.
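
As a toy illustration of the fusion and temporal-boundary questions above, the sketch below fuses per-frame cue probabilities and applies hysteresis thresholds to decide where an activity starts and ends; the cue names, weights and thresholds are invented and are not Smart Eye's pipeline.

# Hypothetical sketch of late fusion plus temporal smoothing for in-cabin
# activity recognition. Cue names, weights and thresholds are illustrative.
import numpy as np

CUES = ["phone_in_hand", "gaze_off_road", "hand_off_wheel"]
WEIGHTS = np.array([0.5, 0.3, 0.2])        # relative trust in each low-level cue

def fuse(frame_scores: np.ndarray) -> np.ndarray:
    """Weighted fusion of per-frame cue probabilities into one activity score."""
    return frame_scores @ WEIGHTS           # shape (n_frames,)

def segment(scores: np.ndarray, on=0.6, off=0.4, min_len=15):
    """Hysteresis segmentation: an activity starts when the fused score rises
    above `on`, ends when it drops below `off`, and is kept only if it lasts
    at least `min_len` frames (one simple way to define temporal boundaries)."""
    segments, start = [], None
    for i, s in enumerate(scores):
        if start is None and s >= on:
            start = i
        elif start is not None and s < off:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(scores) - start >= min_len:
        segments.append((start, len(scores)))
    return segments

# Example on 300 frames (~10 s at 30 fps) of random per-cue probabilities
print(segment(fuse(np.random.rand(300, len(CUES)))))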

Hear from:

Mona Beikirch
Senior Research Engineer
Smart Eye


Closed-loop requirements definition is a key concept to avoid over-designing system sub-components. For improved system performance and minimized energy consumption and environmental impact, it is crucial to equip system architects with tools to evaluate various hypotheses about sub-system components. This presentation will introduce sensor simulation as a tool to support closed-loop requirements definition. A framework is proposed to introduce physically consistent noise and other non-idealities into real-life or synthesized images. These images can then be fed into a system model to evaluate the impact of sensor noise, quantum efficiency (QE) and MTF on KPIs such as eye gaze detection accuracy.
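
As a loose illustration of what such a framework could look like, the sketch below degrades a clean or synthetic frame with QE scaling, shot and read noise, and a Gaussian blur standing in for MTF before it is passed to a downstream model; the parameters and the simplified noise model are assumptions, not the framework presented in this talk.

# Simplified sketch of injecting sensor non-idealities (QE, shot/read noise,
# MTF modeled as a Gaussian blur) into a clean image before feeding it to a
# downstream DMS model. Parameters are illustrative, not OMNIVISION's.
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(clean: np.ndarray, qe=0.6, full_well=10000, read_noise_e=3.0,
            mtf_sigma_px=1.2, bit_depth=10, rng=np.random.default_rng(0)):
    """clean: image normalized to [0, 1], interpreted as incident light level."""
    photons = clean * full_well / qe            # scene photon count per pixel
    electrons = rng.poisson(qe * photons)       # photon shot noise after QE
    electrons = electrons + rng.normal(0.0, read_noise_e, clean.shape)  # read noise
    blurred = gaussian_filter(electrons, mtf_sigma_px)  # optics/pixel MTF proxy
    dn = np.clip(blurred / full_well, 0, 1) * (2**bit_depth - 1)
    return np.round(dn).astype(np.uint16)       # quantized sensor output

# Example: compare a gaze model's KPI on clean vs. degraded frames
frame = np.random.rand(480, 640)                # stand-in for a synthetic IR frame
noisy = degrade(frame)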

Hear from:

Tomas Geurts
Senior Director
OMNIVISION


Current 2D IR DMS systems (2022) and upcoming next-generation 2D RGB/IR OMS systems (≈2025) are expected to rely fully on 2D information. This is a good first step for OEMs: integrating a camera inside the car to monitor head and eye state and alert the driver in case of drowsiness or lack of attention.

Sony believes that in the future, advanced occupant state and context awareness will be needed to achieve a safer cabin.

Such a system will allow the detection of individual seating positions or out-of-position occupants, diverse occupant characteristics like body volume, body size, age or gender, as well as proximity to airbags, headrest or steering wheel. This information will enable personalized and situation-aware safety systems that enhance safety for all occupant categories, not only a predefined category.

Relying only on 2D sensing for safety-critical features is, in Sony’s opinion, not sufficient to achieve the availability and reliability requirements for a future safe cabin. Instead, redundancy and several modalities are necessary, and these will be discussed in this presentation.

Hear from:

Jan-Martin Juptner
Business Development Manager – Automotive
Sony Depthsensing Solutions


How can we develop safe systems (e.g. to tackle driver inattention and driver misuse) as well as enhance the user experience of ADAS feature operation? What is next for HMI? How can appropriate feedback be given such that these systems are seen to be helpful? What is the best way to interface with the driver?

Hear from:

Modar Alaoui
Founder and CEO
Eyeris

Esayas Naizghi
Senior Director, Hardware and Software Solutions
indie Semiconductor

Annika Larsson
Research Adviser, Predevelopment
Arriver

Moderated by
Junko Yoshida
Editor-in-Chief
The Ojo-Yoshida Report

