
Tobii unveils a brand-new software library demo for camera-based DMS at InCabin

Having celebrated its 20th birthday in 2022, Tobii is a leading innovator in eye tracking, developing technology that understands human attention and intent through machine learning, artificial intelligence, and advanced signal processing.

The company unveiled a brand-new software library demonstration for camera-based DMS at InCabin last September. We caught up with the experts at Tobii to hear about this software and their latest advancements.

1. You have solutions for both DMS and development. Can you tell me more about how you support OEMs who need to incorporate interior sensing?

The automotive industry is one area in which attention computing is making waves, and we want to offer as much support as possible to manufacturers looking to incorporate a driver monitoring system into their design. To help reduce the cost of ownership, increase safety, and improve the user experience, the automotive industry can trust Tobii to offer:

  • A new software library for camera-based DMS that specifically targets the key challenges of cost, safety, and user experience. The InCabin event in Brussels on September 15th was the first public showcase of this software, making it available for anyone to try out.
  • A unique full-stack competence to optimize cost and performance. We bring design-to-cost and industrialization experience from consumer electronics through to automotive, which means we can help our customers in the design phase of their own DMS camera.
  • Expertise and advanced methodologies for testing and validating the DMS, in particular for meeting NCAP and regulatory requirements. We understand current regulations and standards like the back of our hand, which means we can manage protocol definition, equipment setup, participant recruitment, analysis, and reporting – whatever the customer needs! Our recent acquisition of Phasya also boosts our ability to monitor physiological and cognitive states, supporting our eye tracking offering in the testing and validation process.
  • Ongoing innovation that enhances driver monitoring with data from sensing modalities other than eye tracking, ensuring a deeper understanding of the driver and support for companies integrating these systems into their vehicles.

It’s worth bearing in mind that driver monitoring systems are shifting from a premium feature to a mandatory one across the automotive sector. In other words, this niche market is turning into a mass market. New solutions better suited to budget cars are needed, and Tobii is filling this gap, but we also see the need for strong guidance and experienced support for manufacturers taking their first steps into the world of driver monitoring; we’re here to help!

2. How do you measure attention?

First things first, we must define what we mean by ‘attention’. Far from being a one-size-fits-all term, attention is a multi-factor state that encompasses many inputs, including drowsiness and visual distraction. This makes measuring attention a unique challenge.

Throughout our journey to deliver end-user value, we’ve broadened our capabilities and technology beyond eye tracking to what we call ‘Attention Computing’. Attention Computing recognizes the complexity of human attention and seeks to further empower machines to communicate with and understand humans in human terms. It is primarily powered by head and eye tracking but incorporates a range of sensor technologies that collect data such as eye and face imagery (from our cameras and illuminators), core signals from head and eye movement (eye gaze and head pose), and attention signals (like drowsiness and fixations).

But attention computing goes even further. We push attention monitoring beyond what eye tracking alone can do through a range of other sensing modalities. One example is the growing development of sensors that monitor vital signs such as heart rate and breathing patterns, which, when combined with other attention data, create a clearer picture of human behavior, attention, and state of mind.
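To make the signal layers described above a little more concrete, here is a minimal illustrative sketch in Python of how core signals (gaze on road, head pose) might be fused with attention signals (drowsiness) and an optional vital sign into a single attention estimate. All names, weights, and thresholds are hypothetical assumptions for illustration and do not represent Tobii’s software or algorithms.

# Illustrative sketch only: a toy fusion of the signal layers described above
# (core signals such as gaze and head pose, attention signals such as drowsiness).
# All names, weights, and thresholds here are hypothetical, not Tobii's API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class DriverState:
    gaze_on_road: bool                      # derived from eye gaze direction
    head_yaw_deg: float                     # head pose relative to straight ahead
    drowsiness_level: float                 # 0.0 (alert) to 1.0 (severely drowsy)
    heart_rate_bpm: Optional[float] = None  # optional vital-sign input


def attention_score(state: DriverState) -> float:
    """Combine several signals into a single 0-1 attention estimate."""
    score = 1.0
    if not state.gaze_on_road:
        score -= 0.4                            # visual distraction penalty
    if abs(state.head_yaw_deg) > 30.0:
        score -= 0.2                            # head turned well away from the road
    score -= 0.4 * state.drowsiness_level       # drowsiness lowers the estimate
    if state.heart_rate_bpm is not None and state.heart_rate_bpm < 50.0:
        score -= 0.1                            # unusually low heart rate as an extra drowsiness cue
    return max(0.0, min(1.0, score))


if __name__ == "__main__":
    sample = DriverState(gaze_on_road=False, head_yaw_deg=45.0, drowsiness_level=0.3)
    print(f"attention estimate: {attention_score(sample):.2f}")

In practice the fusion would be learned rather than hand-weighted, but the sketch shows the basic idea of combining several modalities into one attention picture.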

3. Can you tell me more about how eye tracking developed into ‘attention computing’?

Though we started out as an eye tracking company, that doesn’t fully describe what we offer. In the 20 years since the development of our first eye tracker, we’ve been pursuing a deeper understanding of human behavior and creating technology that puts the insights we uncover to use for our customers.

Attention computing isn’t about one single measurement; it’s about making sense of how humans interact with their environment. It’s about measuring what we have previously been unable to measure by combining a range of highly accurate hardware, software, and analytical tools to create an image of attention and intent. Whether the inputs are gaze, head pose, position, presence, identification, drowsiness, or stress, attention computing takes the whole picture and empowers devices with the ability to respond to humans in a distinctly human way.

We still count eye tracking as one of our core technologies and greatest strengths, but as we continue to innovate, attention computing better captures our overarching mission, technological capabilities, and future as a company.

4. What do you see as the biggest challenge for companies who are assessing attention?

Assessing attention comes with a host of challenges. Here are the two main ones:

  1. Finding the relevant link between the use case and the attention data, as well as defining the performance requirements and the ground truth. When researching human attention, we don’t walk away with black-and-white answers, nor do we measure within black-and-white research scenarios. Defining the behavior associated with visual distraction while driving may seem simple, but it is far from it. The threshold at which something counts as a distraction is the subject of ongoing scientific discussion, and even the experts setting the NCAP protocol struggle to agree on the baseline at which a DMS should alert the driver (a minimal sketch of one such threshold check follows this list). It’s a complex topic and one that will continue to evolve.
  2. Ensuring that the solution works seamlessly for everyone in every potential environmental condition. At Tobii, we have two decades of experience working with some of the biggest companies and research institutions in the world, so we’re used to chasing a high degree of accuracy and reliability in our solutions. We’ve learned that making attention computing work inside a lab for a few sets of participants is relatively easy – the real challenge comes in the real world, in real conditions, amongst the global population. Our niche is in helping companies navigate this steep learning curve and make attention computing work for them in any situation.
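As a concrete illustration of the threshold question raised in point 1, the Python sketch below implements a naive eyes-off-road check that alerts once gaze has been off the road for a configurable number of seconds. The three-second figure and all names are assumptions chosen for illustration; real NCAP protocols and production DMS logic are considerably more nuanced.

# Illustrative sketch only: a naive eyes-off-road distraction check.
# The 3-second threshold is an assumption for illustration, not an NCAP value.
from typing import Optional

OFF_ROAD_ALERT_SECONDS = 3.0  # hypothetical single-glance threshold


class DistractionMonitor:
    """Tracks continuous eyes-off-road time from per-frame gaze samples."""

    def __init__(self, alert_after_s: float = OFF_ROAD_ALERT_SECONDS) -> None:
        self.alert_after_s = alert_after_s
        self.off_road_since: Optional[float] = None

    def update(self, timestamp_s: float, gaze_on_road: bool) -> bool:
        """Return True if the driver should be alerted at this frame."""
        if gaze_on_road:
            self.off_road_since = None           # a glance back on the road resets the timer
            return False
        if self.off_road_since is None:
            self.off_road_since = timestamp_s    # start of an off-road glance
        return (timestamp_s - self.off_road_since) >= self.alert_after_s


if __name__ == "__main__":
    monitor = DistractionMonitor()
    # Simulated 30 Hz gaze samples: on road for the first second, then off road.
    for frame in range(150):
        t = frame / 30.0
        if monitor.update(t, gaze_on_road=(t < 1.0)):
            print(f"distraction alert at t = {t:.2f} s")
            break

Even this trivial check exposes the debate: whether the timer should reset on a brief glance back, accumulate over a window, or adapt to driving context is exactly the kind of baseline question the experts continue to discuss.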

5. What role do you play in UX Design and how does that impact in-cabin solutions?

We are proud to offer eye tracking solutions that support UX designers in many industries, including automotive. Tobii Pro Glasses 3 is our flagship wearable eye tracker that helps researchers and designers better understand how the driver interacts with their environment and the human-machine interface. This data provides previously unseen insights that empower better design decisions for the cockpit and the HMI. 

Recently, we worked with SEAT to improve their UX design, tracking driver attention to understand how their infotainment systems influence attention and distraction, with a view to improving safety and comfort in their vehicles. It’s an interesting case study and shows just how far attention computing can go to help UX designers achieve a high-quality finished product.

If you like what you read, join us in Phoenix this March to find out more about automotive interior intelligence.
Don’t miss out! https://auto-sens.com/incabin/pass

2024 ADAS GUIDE

The state-of-play in today's ADAS market

With exclusive editorials from Transport Canada and SAE, the ADAS Guide is a free resource for our community. It gives a detailed overview of features in today’s road-going vehicles, categorized by OEM, alongside expert analysis.