Interview with Senior Behavioural Data Scientist, Dr Dominic Noy

Dr Noy, a Senior Behavioural Data Scientist at Humanising Autonomy, joins the AutoSens agenda for the Detroit edition (12-13 May) to deliver a keynote address on the limitations of camera-based perception. He took some time out to talk to us about his background, the challenges facing the industry, and what he hopes to cover at AutoSens.

You have a PhD in Experimental Psychology (Psychophysics, Cognitive Modelling, Human-Computer Interaction), a Master’s in Statistics, and a Master’s in Psychology (Psychology of Action). How did you wind up working in automotive?

I was always fascinated by mathematics as a way to better understand the human mind. With quantitative modelling tools from the behavioural sciences, statistics and machine learning, we’re able to push beyond the boundaries of traditional mathematics-based models to better interpret human action and interaction. For my PhD, I modelled how humans control their walking patterns and how these are affected by the environment and social interactions. At the time I wasn’t really thinking about the automotive sector, though in hindsight there’s obvious crossover.

Dr. Dominic Noy, Senior Behavioural Data Scientist at Humanising Autonomy, who will deliver the presentation “Beyond physics: Tackling the limitations of camera-based perception.”

Owing to the emergence of increasingly powerful machine learning algorithms, we’re about to see an explosion of semi-autonomous mobility systems. As AVs proliferate in cities, the problem shifts to become one of interaction: autonomous systems will need to extract meaningful information from vast amounts of sensor input. While machine learning algorithms learn from this data, psychological tools usually use data to validate hypotheses about the underlying cognitive mechanisms.

My work at Humanising Autonomy allows me to merge both approaches to build transparent machine learning models that help us understand and predict human behaviour. I expect that the ability to reliably predict human behaviour is going to have a disruptive impact on how we can use mobility systems (AVs, ADAS) in urban environments.

We all know that automated systems need to better understand people. Do you think this is an easy challenge to solve? Would you say it is one of the biggest challenges in achieving autonomous driving?

We won’t understand every nuance of human behaviour in every situation; that is an impossible problem to solve. What we can do is better understand the important factors that underlie human-vehicle interaction so that autonomous systems function safely and smoothly. It is certainly a challenge to identify these factors and extract them from sensor input, but the bigger challenge is to build systems that are robust and trustworthy. False predictions in an urban environment can be fatal. Yet just as people can’t predict with 100% accuracy what will happen on the roads, neither can machines, particularly in crowded scenes, at night, or in rain, fog or snow.

In contrast to most systems, attentive human drivers are aware of their uncertainties and can adjust their behaviour accordingly, e.g. by slowing down. Replicating that capability, reliably estimating one’s own uncertainty, is the biggest challenge.
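To make the idea of a system estimating its own uncertainty concrete, here is a minimal, hypothetical sketch, not Humanising Autonomy’s actual method: an ensemble of simple pedestrian-position predictors whose disagreement is read as predictive uncertainty, which a toy planner rule then converts into a lower target speed. All names and parameters are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: an ensemble of noisy constant-velocity predictors.
# The spread of the ensemble's forecasts is used as a rough proxy for the
# model's own uncertainty about where a pedestrian will be.

def predict_positions(track, horizon_s=2.0, n_members=50, noise_std=0.3):
    """track: array of shape (T, 2) with recent (x, y) positions in metres,
    sampled at 10 Hz. Returns the mean forecast and its standard deviation."""
    dt = 0.1
    velocity = (track[-1] - track[-2]) / dt          # last observed velocity
    forecasts = []
    for _ in range(n_members):
        # Each ensemble member perturbs the velocity estimate slightly.
        v = velocity + np.random.normal(0.0, noise_std, size=2)
        forecasts.append(track[-1] + v * horizon_s)
    forecasts = np.array(forecasts)
    return forecasts.mean(axis=0), forecasts.std(axis=0)

def target_speed(uncertainty, nominal_mps=8.0, scale=2.0):
    """Toy planner rule: the more uncertain the forecast, the slower we drive."""
    return nominal_mps / (1.0 + scale * float(np.linalg.norm(uncertainty)))

if __name__ == "__main__":
    recent_track = np.array([[0.0, 0.0], [0.1, 0.05], [0.2, 0.12]])
    mean_pos, sigma = predict_positions(recent_track)
    print("forecast:", mean_pos, "uncertainty:", sigma)
    print("target speed (m/s):", target_speed(sigma))
```

The point of the sketch is the interface, not the model: whatever predictor is used, exposing a calibrated uncertainty alongside the forecast is what lets the vehicle "slow down when unsure" in the way an attentive human driver does.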

How do automotive engineers generally respond when they hear about your approach?

With interest! Many engineers in this space already experience difficulties around diverse pedestrian behaviour, but it’s too far down the priority list to tackle first. Time and resources are spent elsewhere, not on solving challenges around pedestrian-vehicle interaction. Systems on the road today use similar prediction models for different object classes; for humans these models only work in about 95% of cases when averaged across environments, and in city centres they fail heavily.
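For readers unfamiliar with the kind of model being criticised here, the sketch below is an illustrative example of our own (not a description of any specific deployed system): a purely physics-based, constant-velocity extrapolation that holds up for vehicles on open roads but misjudges a pedestrian who simply stops at a kerb.

```python
import numpy as np

# Illustrative only: constant-velocity extrapolation, the kind of
# physics-only prediction that copes with vehicles but misjudges
# pedestrians, who stop, hesitate and change direction abruptly.

def constant_velocity_forecast(track, horizon_s, dt=0.1):
    """Extrapolate the last observed velocity over the forecast horizon."""
    v = (track[-1] - track[-2]) / dt
    return track[-1] + v * horizon_s

# A pedestrian walking towards a kerb, then stopping to check traffic.
walking = np.array([[x * 0.14, 0.0] for x in range(10)])   # ~1.4 m/s
stopped = np.tile(walking[-1], (10, 1))                     # stands still
observed = np.vstack([walking, stopped])

pred = constant_velocity_forecast(walking, horizon_s=1.0)   # made while walking
actual = observed[-1]
print("predicted position after 1 s:", pred)
print("actual position after 1 s:  ", actual)
print("error (m):", np.linalg.norm(pred - actual))
```

The metre-scale error in this toy case is exactly the kind of false prediction that, at city speeds, triggers the abrupt braking described below.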

If companies don’t start tackling this problem, we won’t see AVs in pedestrian-dense environments anytime soon. Passengers won’t accept a system that abruptly decelerates, accelerates and stops due to false predictions, as it could cause motion sickness; unnecessary emergency braking could also be dangerous for passengers and other road users.

Without giving too much away, what do you hope the AutoSens engineers will take away from your presentation in May?

By showing some concrete examples, I want to make clear that predicting pedestrian behaviour isn’t possible with current approaches. I will highlight that we need to extract more meaningful information about pedestrians, putting the pedestrian first when developing models, to enable a smooth and safe ride in cities.

Join Dr. Dominic Noy, Senior Behavioural Data Scientist at Humanising Autonomy, and watch his presentation on “Beyond physics: Tackling the limitations of camera-based perception.”
