Please note: Session times and locations are subject to change
Check-in / exhibition opens
- Wednesday 15th March
- 5:00pm MDT
Roundtable discussions
- Wednesday 15th March
- 5:30pm MDT
- Mezzanine Stage
Roundtable discussions
Free-to-attend discussion sessions are included with your ticket.
Our roundtable discussions give attendees the chance to come together over a beer to discuss a hot industry topic before the main conference content starts. Led by an industry expert, these are off-the-record, informal group chats in which you can take an active or passive role. Previous topics include “What is actually relevant in terms of driver state sensing?”, “What matters most, in real life, for real-world applications?” and “What are the requirements to meet and keep five-star ratings?”
- 6:00pm MDT
- Wednesday 15th March
Welcome reception in the exhibition, sponsored by Veoneer

Check-in / exhibition opens
- Thursday 16th March
- 8:30am MDT

Hayley Sarson,
Operations Director,
Sense Media Group
Building an immersive car and enhancing the user experience – the road ahead
- Thursday 16th March
- 9:30am MDT
- Virginia G. Piper Theater
The presentation will address present and future demands of the automotive in-cabin consumer experience. It will cover in-cabin monitoring – driver and occupant monitoring – explore related functions such as cognitive distraction, stress-free routing, and personalized experiences, and discuss supporting technologies such as artificial intelligence / machine learning and sensor technologies including camera and radar sensors.

Dr. Peter Amthor,
Chief ADAS/AD Technology and Innovation Expert,
Harman

Can a driver’s attention be successfully divided?

Dave Mitropoulos-Rundus,
Senior Engineer,
Hyundai American Technical Center
Safety legislation – is a federal regulatory framework within reach? Exploring latest developments
- Thursday 16th March
- 10:30am MDT
- Virginia G. Piper Theater
- 11:00am MDT
- Thursday 16th March
Networking refreshment break, sponsored by Optalert

- 11:15am MDT
- Thursday 16th March
Press briefing (Contact us to attend as media)
ADAS and AV sensor suite: emerging trends and developments
- Thursday 16th March
- 11:45am MDT
- Virginia G. Piper Theater
This presentation will cover Tech Insights’ detailed analysis and forecasts for the ADAS and autonomous vehicle (AV) sensor suite. It will highlight the challenges facing the industry: bringing safety solutions to mass-market vehicles at a low price point to comply with NCAP requirements, while also expanding autonomous convenience functions, developing L2+ systems, and reducing the cost of the overall ADAS/AD sensor suite on the path to autonomous vehicles.

Mark Fitzgerald,
Director, Autonomous Vehicle Service,
Strategy Analytics

What’s next for in-cabin? Future outlooks for technology, industry & regulation
- Thursday 16th March
- 12:15pm MDT
- Virginia G. Piper Theater
This panel looks at future outlooks for technology, industry and regulation: how is the industry changing, how can we prepare for it, and are further industry collaborations the way forward?

Junko Yoshida,
Co-Founder & Editor-in-chief,
The Ojo-Yoshida Report
(Moderator)

Nandita Mangal,
Platform Owner - HMI Vehicle Experience,
Aptiv
(Panellist)

Allen Lin,
Technical Specialist – Driver Monitoring, Night Vision, and Interior Camera Systems,
General Motors
(Panellist)

Caroline Chung,
Engineering Manager,
Veoneer
(Panellist)

- 1:00pm MDT
- Thursday 16th March
Networking lunch break, sponsored by STMicroelectronics

Mitigating unsafe driving situations using interior monitoring systems
- Thursday 16th March
- 2:15pm MDT
- Virginia G. Piper Theater
Interior monitoring can enhance safety, comfort, and convenience for all vehicle occupants. Today these systems can detect driver distraction, signs of drowsiness, and even whether a child has been left behind in the vehicle, and alert the driver to these critical situations. By using some of the same base signals used to detect the functions above, we may be able to effectively reduce other unsafe driving behaviors, such as driving under the influence of alcohol or driving while affected by sudden illness. These interior monitoring systems will be critical for determining whether a driver is able to perform the dynamic driving task.

Fabiano Ruaro,
Product Manager for Interior Monitoring Systems,
Bosch

Supporting the DMS on existing ECU with extendable AI accelerator module
- Thursday 16th March
- 2:15pm MDT
- Stage 2

Darren Chen,
Head of Automotive Technology Initiative,
LITE-ON Technology Corp.

Body height and weight estimation of vehicle occupants
- Thursday 16th March
- 2:40pm MDT
- Virginia G. Piper Theater
To minimize occupant injury during a crash, an intelligent adaptive occupant restraint system can use each occupant’s body height and weight to improve safety individually. Both values can be obtained with an interior camera that is normally used for driver monitoring tasks. The proposed method makes a first height prediction based only on the occupant’s face and fuses the result with further body-height-related features. The weight estimation then uses this height prediction and fuses it with further body-weight-related features. The method combines convolutional neural networks with machine-learning-based regression.
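As a rough illustration of the two-stage fusion described above, the sketch below chains a face-only height estimate into a fused height model and then into a weight model. The feature extractors, model choices and data layout are assumptions made for illustration, not the presenter’s implementation.

```python
# Illustrative sketch only: two-stage height/weight estimation in the spirit of
# the approach above. Feature arrays are assumed to come from a CNN backbone;
# model types and shapes are placeholders, not IAV's implementation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def train_height_weight_models(face_feats, body_feats, heights, weights):
    """face_feats, body_feats: (n_samples, n_features) arrays; heights, weights: (n_samples,)."""
    # Stage 1: first height prediction from face features only.
    face_height = GradientBoostingRegressor().fit(face_feats, heights)
    h_face = face_height.predict(face_feats).reshape(-1, 1)

    # Stage 2: fuse the face-based height with further body-height-related features.
    height_model = GradientBoostingRegressor().fit(np.hstack([h_face, body_feats]), heights)
    h_fused = height_model.predict(np.hstack([h_face, body_feats])).reshape(-1, 1)

    # Stage 3: weight estimation reuses the fused height prediction plus body features.
    weight_model = GradientBoostingRegressor().fit(np.hstack([h_fused, body_feats]), weights)
    return face_height, height_model, weight_model
```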

Patrick Laufer,
Development engineer,
IAV Vehicle Safety

InCabin Monocular 3D Sensing: from Current Challenges to Future Opportunities
- Thursday 16th March
- 2:40pm MDT
- Stage 2
This session covers the latest techniques, challenges and opportunities in implementing InCabin monocular 3D sensing using 2D image sensors. Enabling a new category of use cases for enhanced safety and optimized comfort, the session will also cover how efficient inference on AI chips can generate new types of depth-aware data and new monetization models inside this third living space.

Modar Alaoui,
Founder and CEO,
Eyeris

Reliable in-cabin awareness for occupant safety via multi-sensor fusion
- Thursday 16th March
- 3:05pm MDT
- Virginia G. Piper Theater
Current 2D IR DMS systems and upcoming next-generation 2D RGB/IR OMS systems (≈2026) are expected to rely fully on 2D information. This is a good first step for OEMs: integrating a camera inside the car to monitor head and eye state and alert the driver in case of drowsiness or lack of attention.
Sony believes that in the future, advanced occupant state, posture and context awareness will be needed to achieve a safer cabin for all occupants. Several sensors therefore need to be fused, not only to monitor the driver but to understand the activity and behavioral context of the driver and all passengers. With this understanding, active safety systems step up in performance and passive safety systems, such as active restraint systems, can be optimally supported.
In this talk, Sony describes the necessity of sensor fusion, supports the argument with research, and shows ways to enable safe cabins.

Jan-Martin Juptner,
Business Development Manager – Automotive,
Sony Depthsensing Solutions

Making use-for-good of in-cabin data
- Thursday 16th March
- 3:05pm MDT
- Stage 2
Today’s embedded sensor technology and location APIs make the car the perfect incubator for a new standard in infotainment. Trip Lab is developing responsive in-cabin experiences powered by Empathetic AI. Our mission: to create digital health interventions that help drivers feel better when they step out of the car than when they got in. This presentation will showcase real use-case examples of improved wellbeing in the car, driven by the ability to sense, understand and support any state of driving.

Alex Wipperfürth,
President,
Trip Lab

- 3:30pm MDT
- Thursday 16th March
Networking refreshment break
In-cabin applications: towards an intelligent automotive interior
- Thursday 16th March
- 4:15pm MDT
- Virginia G. Piper Theater
In-cabin applications are becoming more complex, offering more connectivity, monitoring, and personalization. Driver and occupant monitoring are the leading applications because of their inclusion in future regulations. Several technical developments have been observed in 2D and 3D vision for driver monitoring, and for occupant monitoring, radar technology can provide another solution for Tier 1s. But in-cabin applications are not limited to monitoring. Interior lighting is growing rapidly from functional lighting to more personalized lighting and could even be used for communication and safety applications. Air quality monitoring is still in its infancy, but PM2.5 sensors based on the light-scattering principle are starting to be implemented by some OEMs, such as Volvo. In-cabin applications require more sensors and more computing, so the car architecture will be affected: moving from a distributed to a more centralized architecture with domain controllers will give OEMs the possibility to monetize software services.

Pierrick Boulay,
Senior Analyst - Lighting and ADAS systems,
Yole Intelligence

A versatility approach to the cybersecurity of the in-cabin image sensors
- Thursday 16th March
- 4:15pm MDT
- Stage 2
Connectivity is increasingly being integrated and expanded into newer vehicle models to improve efficiency and safety for people and property. This high level of connectivity poses increasing risks of external attacks on vehicle systems that automotive manufacturers must protect against. Established cybersecurity methods are being leveraged and brought to automotive architectures to provide increased security. New-generation in-cabin sensors will need to support cybersecurity features to establish a secure channel between the sensor and the host through mutual authentication and to provide video-stream integrity. However, the level of protection required is not clearly defined, and the lack of a standard demands flexibility in the cybersecurity features of the image sensor – which we will discuss.
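One simple way to picture the “video stream integrity” requirement is a per-frame message authentication tag shared between sensor and host after mutual authentication. The sketch below is only an illustration under that assumption; the key exchange, framing and algorithm choices are not taken from ST’s product.

```python
# Illustrative sketch only: per-frame integrity tags with a shared session key.
# Assumes mutual authentication has already produced session_key; framing and
# MAC choice are placeholders, not a vendor implementation.
import hmac
import hashlib

def tag_frame(session_key: bytes, frame_index: int, frame_bytes: bytes) -> bytes:
    # Include a frame counter so replayed or reordered frames fail verification.
    msg = frame_index.to_bytes(8, "big") + frame_bytes
    return hmac.new(session_key, msg, hashlib.sha256).digest()

def verify_frame(session_key: bytes, frame_index: int, frame_bytes: bytes, tag: bytes) -> bool:
    expected = tag_frame(session_key, frame_index, frame_bytes)
    return hmac.compare_digest(expected, tag)
```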

Charles Kingston,
Senior Imaging Application Development Engineer,
STMicroelectronics

How do lighting technologies and characteristics cross-over with in-cabin safety, comfort and applications?

Philippe Aumont,
General Editor,
DVN Interior
(Moderator)
Emerging Dynamic Vision Sensor Technology Enhances Driving Safety and Privacy in Smart Cockpit
- Thursday 16th March
- 4:40pm MDT
- Stage 2
Driver monitoring systems (DMS) are crucial to driver safety. Traditional DMS use RGBIR sensors and AI models to detect driver states in real time. However, these systems are sensitive to image illumination, contrast, and dynamic range, and can also raise privacy and security concerns due to the use of in-cabin cameras. In this session, we present an event-based DMS with dynamic vision sensors (DVS) to overcome these limitations. We will discuss the properties of DVS and show how we integrate a DVS into an edge device, together with event-based AI models we have developed, achieving a robust DVS DMS with high privacy.

Kai-Chun Wang,
Machine Learning Engineer,
iCatch Technology

- 5:45pm MDT
- Thursday 16th March
Main Drinks Reception sponsored by DTS/Xperi

Check-in / exhibition opens
- Friday 17th March
- 8:30am MDT
Health and impairment detection through cognitive states monitoring – development and validation challenges
- Friday 17th March
- 9:15am MDT
- Virginia G. Piper Theater
Driver impairment is a growing concern for the automotive industry. This presentation will cover how to develop solutions that detect driver health and impairment through physiological and cognitive state monitoring (e.g., cognitive load, stress). We will discuss the challenges of developing and validating such solutions – which data to use, which sensors, and which ground truth – and describe the relevant in-cabin use cases, focusing on safety, wellbeing, and user experience.

Clémentine François,
Biomedical Engineering Manager,
Tobii AB

The State of Intoxication Research: The Opportunities and Challenges
- Friday 17th March
- 9:40am MDT
- Virginia G. Piper Theater
For as long as cars have existed, drunk driving has been a complex problem in need of comprehensive solutions. Different strategies for mitigating intoxicated driving have been deployed for over a century, yet none of them have proven to be very effective. Critical research is being conducted by government bodies, the automotive industry, technology companies and academia to determine better approaches.
Driven by technical enhancements in Artificial Intelligence and global regulatory efforts, Driver Monitoring Systems are fast becoming a leading human-centered automotive safety system. Initially focused on detecting distracted and drowsy driving, these systems may offer keys to detecting and mitigating driver impairment due to intoxication.
In this presentation we will explore the state of alcohol intoxication research and opportunities to leverage evolving driver monitoring (DMS) technology to enhance road safety – all drawing from Smart Eye’s existing research collaborations.

Detlef Wilke,
Vice President of Automotive,
Smart Eye

Radar based vital signs monitoring: how it works and why you want to use it
- Friday 17th March
- 10:05am MDT
- Virginia G. Piper Theater
Radar can measure heart rate through fine measurement of skin vibration at every heartbeat, and respiration rate by measuring the movement of the chest at each breath. These vital signs offer a wealth of information beyond what can be gleaned visually with conventional camera-based driver monitoring systems: one may look calm while experiencing great stress, but changes in heart rate and respiration rate reveal what even the most stoic individual is experiencing. Additionally, having radar in the cabin can offer child-presence detection, occupancy detection, and other features at little to no additional cost. In this talk we will cover the basics of how radar-based VSM works, what features it enables, how it could be used to enhance automotive safety and convenience, and where it beats cameras for in-cabin monitoring.
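As a rough sketch of how such rate extraction can work, the example below band-pass filters an assumed chest-displacement signal and picks the dominant spectral peak; the sample rate, frequency bands and signal source are illustrative assumptions, not the presenter’s pipeline.

```python
# Illustrative sketch only: estimating respiration or heart rate from a radar
# displacement signal by band-pass filtering and spectral peak picking.
import numpy as np
from scipy.signal import butter, filtfilt

def dominant_rate_bpm(displacement, fs, low_hz, high_hz):
    """Return the strongest periodicity in [low_hz, high_hz] in cycles per minute."""
    b, a = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs)
    band = filtfilt(b, a, displacement)
    spectrum = np.abs(np.fft.rfft(band * np.hanning(len(band))))
    freqs = np.fft.rfftfreq(len(band), d=1.0 / fs)
    in_band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[in_band][np.argmax(spectrum[in_band])]

# Assumed usage on a chest signal sampled at 50 Hz:
# respiration_bpm = dominant_rate_bpm(chest_signal, fs=50, low_hz=0.1, high_hz=0.5)
# heart_bpm = dominant_rate_bpm(chest_signal, fs=50, low_hz=0.8, high_hz=3.0)
```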

Harvey Weinberg,
Director Sensor Technologies,
Microtech Ventures

- 10:30am MDT
- Friday 17th March
Networking refreshment break
From drowsiness monitoring to sleep prediction: a scalable architecture
- Friday 17th March
- 11:15am MDT
- Virginia G. Piper Theater
A family of IPs, named PREDICTS, has been developed over the years to precisely predict in real time the sleep onset of a subject through the analysis of a few selected physiological parameters, running on both contact (wearable) and contactless (radar) systems.
A detailed validation phase was run in incremental steps with a significant number of subjects, on overnight tests and on the dynamic vehicle simulator at AVL (Graz).
Since PREDICTS is “sensor agnostic”, a scalable and flexible architecture has been defined to address both aftermarket and OEM applications.
A pilot application is in progress, combining a camera-based system and a wearable device on a fleet of heavy-duty trucks in collaboration with Garmin, which is providing the sensing technology. It is among the first validations in the transport/logistics field.
In this talk, we will present the key concepts and the preliminary results.

Riccardo Groppo,
CEO & Founder,
Sleep Advice Technologies

Optics still matter in the Future!
- Friday 17th March
- 11:15am MDT
- Stage 2
The InCabin revolution will expand from the initial focus on DMS to many vision-based applications, and this presentation takes a hands-on approach, discussing future InCabin vision applications through the lens of an actual lens manufacturer. The optical stack is often seen as a commodity nowadays, even though the lens is a crucial driver of performance, differentiation and innovation. Applying practical optimization and trade-off strategies at the lens level supports the development of a differentiated camera product that satisfies regulations and automotive requirements on the one hand and balances cost on the other.

Ingo Foldvari,
Director of Business Development,
Sunex Inc.

Vision based AI for drowsiness detection – a high risk approach to a solved problem
- Friday 17th March
- 11:40am MDT
- Virginia G. Piper Theater
Vision-based AI is a dangerous approach to drowsiness detection. This presentation covers several risks and pitfalls automotive suppliers will encounter, stemming from misconceptions and incorrect assumptions about the nature of drowsiness. We present an alternative approach based on a physiological biomarker for drowsiness derived from eyelid movements – a methodology that can easily be incorporated into any vision-based driver monitoring system today. In addition, we will discuss other neurological biomarkers that can be derived from eyelid movement, such as the onset of Alzheimer’s, epilepsy and neurotoxicity.

Dr. Trefor Morgan,
GM R&D,
Optalert


Per Fogelström,
Principal system architect automotive,
Tobii
Drowsiness Prediction Based on an iToF Sensor for iCM Applications
- Friday 17th March
- 12:05pm MDT
- Virginia G. Piper Theater
Nowadays, most drowsiness detection systems are based on visible manifestations of drowsiness (e.g., yawning, closing eyelids) and therefore have a very limited time window in which to act. We present a system based on breathing that detects drowsiness several minutes before any visible manifestation appears. Breathing is captured through a 3D iToF sensor in the interior of the vehicle.
The sensor measures the Z-axis movements of the driver’s chest, corresponding to the breathing signal. The Thoracic Effort Drowsiness Detection (TEDD) algorithm analyzes this signal to predict, several minutes in advance, whether the driver will become too impaired to drive due to drowsiness.
The proposed DMS shows promising results for breathing-signal extraction: compared to a chest band, we obtain a Pearson correlation coefficient (PCC) of 0.98, and for drowsiness prediction we obtain the same results with both systems.
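A minimal sketch of the agreement check quoted above – correlating the iToF-derived breathing signal with a chest-band reference – might look like the following; variable names, alignment and sampling are assumptions, not Nextium’s validation code.

```python
# Illustrative sketch only: Pearson correlation between an iToF chest signal
# and a chest-band reference, both assumed resampled and time-aligned.
import numpy as np
from scipy.stats import pearsonr

def breathing_agreement(itof_chest_z: np.ndarray, chest_band: np.ndarray) -> float:
    # Remove slow drift (e.g. posture changes) before correlating.
    itof = itof_chest_z - np.mean(itof_chest_z)
    band = chest_band - np.mean(chest_band)
    pcc, _p_value = pearsonr(itof, band)
    return pcc  # a value near 0.98 would indicate close agreement
```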

Brenda Meza García,
Innovation Leader,
Nextium by IDNEO

Optical Design Technology for In-Cabin Imaging System
- Friday 17th March
- 12:05pm MDT
- Stage 2
Today’s automotive technology is advancing at an unprecedented rate. User experience and satisfaction play a critical role in automotive system development, especially regarding in-cabin system requirements. As a result, vehicle imaging systems are moving away from narrow-field-of-view driver monitoring towards a wider field of view (FoV) that captures the entire cabin (occupants and back seat). This means combining the mandatory Driver Monitoring System (DMS) with more modern user experiences such as occupant identification and video surveillance in one single sensor, reducing cost and the impact on interior design.
In this session we will discuss how advanced freeform lenses with smart pixel management maximize resolution on the driver while capturing the entire cabin in varying lighting conditions. We will show how advanced image-processing algorithms, including dewarping, deliver the best pixels for either human or computer vision to maximize application efficiency.
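As a loose illustration of dewarping a wide-FoV in-cabin frame, the sketch below uses OpenCV’s generic fisheye model; the freeform optics discussed in the talk would use their own calibration and mapping, and the intrinsics and distortion values here are placeholders.

```python
# Illustrative sketch only: undistorting a wide-FoV frame so a driver-facing
# crop can be passed to a DMS network. K and D are placeholder calibration
# values, not data for any real lens.
import cv2
import numpy as np

K = np.array([[400.0, 0.0, 640.0],
              [0.0, 400.0, 360.0],
              [0.0, 0.0, 1.0]])       # assumed intrinsics for a 1280x720 sensor
D = np.array([0.1, -0.05, 0.0, 0.0])  # assumed fisheye distortion coefficients

def dewarp(frame_bgr: np.ndarray) -> np.ndarray:
    h, w = frame_bgr.shape[:2]
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
    return cv2.remap(frame_bgr, map1, map2, interpolation=cv2.INTER_LINEAR)
```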

Patrice Roulet-Fontani,
VP Technology & Co-Founder,
Immervision

- 12:30pm MDT
- Friday 17th March
Networking lunch break
HMI Trends, technology for developing seamless cockpit user experiences in the era of domain controllers
- Friday 17th March
- 1:30pm MDT
- Virginia G. Piper Theater
As the industry moves towards the consolidation of ECUs and electrification, the need for differentiation between brands has driven the focus towards the user experience in the cockpit. This shift throws up several challenges in developing a coherent, seamless user experience that is very different from your mobile phone – a mix of what connected TV offers on top of the smartphone you carry. In this session we talk about trends, developing HMI for very discrete systems in the seamless cockpit of the future, and the vision of a gamified and personalised cockpit.

Adrian Capata,
Senior Vice President, Engineering,
DTS/Xperi
Imagining the Immersive User Experience
- Friday 17th March
- 2:30pm MDT
- Virginia G. Piper Theater

Junko Yoshida,
Co-Founder & Editor-in-chief,
The Ojo-Yoshida Report
(Moderator)

Colin Barnden,
Principal Analyst – Automotive ADAS and In-Cabin Monitoring,
Semicast
(Panellist)

Prateek Kathpal,
CTO,
Cerence
(Panellist)
