Pioneering Image Quality Standards in the Automotive Industry with Brian Deegan, University of Galway

Brian Deegan from the University of Galway will be joining us at AutoSens Europe with an insightful tutorial on “Camera Image Quality Assessment for Automotive Perception Systems”. Find out more about his experience in Image Quality, the key Image Quality metrics, and how these metrics are evolving.
1. Journey and interest in image quality 

My career path has been pretty non-linear, I’ve bounced around a bit. My undergraduate degree was in Computer Engineering, and from there, I did a PhD in Biomedical Engineering. My thesis was on blood flow regulation in the brain, and how the vascular system works to keep a constant blood flow in the brain, despite rapid changes in blood pressure. I finished up in 2011, right at the peak of the recession, and needed a job. I knew a guy in Valeo, sent in my CV, and the rest is history! 

I spent the next decade working on image quality for automotive camera systems. In that time, the field evolved beyond all recognition. The first cameras I worked on were relatively simple driver assistance cameras. Over time, systems became more and more complex – HDR sensors, multi-camera systems, cameras with machine vision, mirror replacement systems, interior cameras, autonomous driving cameras, etc. Lots of challenges, lots of fun along the way! 

I first started getting involved with standards, particularly P2020, because I was constantly frustrated by the lack of suitable image quality standards for automotive camera systems. The existing image quality standards were largely based on consumer photography and security requirements. These standards were not applicable to the fisheye, high dynamic range camera systems I was working on.  

Things came to a head around 2016 or so, when the real push towards self-driving cars took off. OEMs were making claims about having self-driving cars on the road by 2020. Those of us working on image quality recognised the need for automotive-specific standards, and this led to the development of the P2020 standard and community. 

I got particularly interested in standards for LED flicker because I had seen many demos claiming to have solved LED flicker with methods and approaches that could only work in very specific circumstances (if at all). For me, this was very concerning. On the one hand, there was a lack of understanding of the underlying principles that cause LED flicker; on the other, there was no standard method for testing it. Thanks to the work of the P2020 Flicker sub-group, the industry is now well-informed regarding the causes, limitations, and viable mitigation strategies. I’m very proud of the work I did with Bob Black and the rest of the Flicker sub-group; it’s been a career highlight for me. 
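
To illustrate the underlying mechanism in the simplest terms, here is a toy sketch (in Python) of a pulsed LED sampled by a rolling series of short camera exposures. All of the numbers, such as the PWM frequency, duty cycle, exposure time and frame rate, are made-up illustrative values rather than figures from the P2020 work.

```python
import numpy as np

# Toy model of the LED flicker mechanism. All numbers are illustrative
# assumptions, not values from P2020 or the tutorial.
pwm_freq_hz = 100.0     # assumed PWM frequency of the LED source
duty_cycle = 0.2        # LED is on for 20% of each PWM period
exposure_s = 1e-3       # short exposure, as used for bright HDR scenes
frame_rate_hz = 30.0
n_frames = 300

pwm_period = 1.0 / pwm_freq_hz

def led_visible(t_start, t_end, steps=1000):
    """True if the LED is on at any point during the exposure window."""
    t = np.linspace(t_start, t_end, steps)
    return bool(np.any((t % pwm_period) < duty_cycle * pwm_period))

frame_starts = np.arange(n_frames) / frame_rate_hz
captured = [led_visible(t, t + exposure_s) for t in frame_starts]
print(f"LED captured in {100 * np.mean(captured):.0f}% of frames")

# Because the exposure (1 ms) is much shorter than the PWM period (10 ms),
# many frames sample only the LED's off time, so a steadily lit LED appears
# to blink in the video. Extending the exposure to cover a full PWM period
# removes the effect, but that is exactly what the short exposures needed
# for bright HDR scenes cannot do.
```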

2. Key aspects of image quality metrics 

One of the big challenges in recent years is quantifying how camera image quality affects perception algorithms (such as pedestrian detection, traffic sign recognition, lane marking recognition etc). Historically, camera image quality standards largely focused on photographic considerations, i.e. how to make an image look “good” to a human. Instead, I will focus on how to make camera images look “good” to AI perception systems.  

In this tutorial, I will focus on automotive camera design, performance characterization, and real-world considerations. To give an example, consider a customer requirement that states “The camera system must be able to read the text on a road sign at a distance of 100m, at night time”. How do you define the minimum performance and functional requirements for such a camera system? And how can you validate the performance? 

This last part is tricky – how do you separate camera performance from AI algorithm performance? Do you have a good camera and a bad algorithm, or vice versa? In this tutorial, I will discuss approaches for tackling this problem. 
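
As a rough back-of-envelope sketch of the first question, the snippet below estimates how many pixels such a camera would place across one character of the sign text; every number in it (field of view, resolution, text height) is an illustrative assumption, not a requirement from the tutorial.

```python
import math

# Back-of-envelope check of the geometric part of the requirement.
# Every number below is an illustrative assumption, not a value from the tutorial.
distance_m = 100.0         # required reading distance from the requirement
char_height_m = 0.20       # assumed height of the text on the sign
sensor_h_pixels = 1920     # assumed horizontal sensor resolution
hfov_deg = 60.0            # assumed horizontal field of view

# Approximate angular resolution, assuming a roughly uniform mapping.
# A real fisheye lens needs the actual distortion model instead.
pixels_per_degree = sensor_h_pixels / hfov_deg

# Angular size of one character at the required distance.
char_angle_deg = math.degrees(2 * math.atan(char_height_m / (2 * distance_m)))

pixels_per_char = pixels_per_degree * char_angle_deg
print(f"~{pixels_per_char:.1f} pixels across one character at {distance_m:.0f} m")

# A rough rule of thumb for text recognition is on the order of 8+ pixels per
# character, and that is before night-time SNR, motion blur and lens MTF are
# considered, which is where the validation question really starts.
min_pixels_per_char = 8
print("geometry OK" if pixels_per_char >= min_pixels_per_char
      else "resolution/FOV alone cannot meet the requirement")
```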

3. The evolving role of image quality metrics in the automotive industry 

This is an active area of discussion within P2020. Now that the first release is (hopefully!) about to be published, we are already considering the next set of challenges. There are many different areas, but I personally think colour will be the next important area to tackle. We encode so much information into our road networks through colour: line markings, stop signs, traffic lights, you name it. AI algorithms need to be able to distinguish colour. For example, is the line in the road white or yellow? This information matters because it affects how the autonomous vehicle should behave. 

The majority of colour metrics are based on human perception of colour. In fact, you can make a strong argument that colour is a human perception, not a physical property. For me, an open question is: do we need to be tied to approaches and metrics based on human perception, or is there a better way to quantify colour? A few of us (Patrick Denny, Paola Iacomussi, Alexander Braun, Robert Dingess) are building a consortium to tackle this exact problem. 
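
To make the human-perception point concrete, here is a minimal sketch of the kind of metric in use today: the classic CIE76 ΔE, which measures colour error as a Euclidean distance in the perceptually motivated CIELAB space. The patch values are invented purely for illustration.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space.

    CIELAB itself is built on a model of human vision (a reference white
    and a non-linear lightness response), which is exactly the
    human-centred assumption being questioned above.
    """
    return math.dist(lab1, lab2)

# Invented example values: a reference yellow lane-marking patch versus a
# measurement of the same patch through a camera pipeline.
reference_lab = (81.0, 2.0, 80.0)
measured_lab = (78.5, 5.0, 71.0)

print(f"Delta E (CIE76) = {delta_e_cie76(reference_lab, measured_lab):.1f}")
# A Delta E of roughly 1-2 is barely noticeable to a human observer; whether
# the same thresholds are meaningful for a detection network is the open question.
```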

Apart from colour, the real value I see in image quality metric development is the ability to accurately quantify how well a camera will support autonomous driving. One major challenge facing the autonomous driving community is that it is very hard to link image quality with AI algorithm performance. This can lead to both over- and under-engineering of camera systems. So establishing robust methods of linking camera performance to AI algorithm performance can make systems more robust, and potentially cheaper as well. 

4. Why are forums such as AutoSens important for the automotive ADAS and AD industry? 

AutoSens is very important. It is a forum where OEMs, tier 1s, tier 2s and academics can meet, get updates on the latest developments, and talk openly about common challenges. There is an excellent diversity of topics covered across the sessions, and the demo sessions are very vibrant.  

The tutorials (and AutoSens Academy) are very important because, for many of the topics covered, there is no reference book, website or YouTube channel where people can get access to this information. You are also learning from people who have practical, hands-on experience, and who know the real-world challenges in their area. You really can’t get access to this knowledge anywhere else. 

Find out more about AutoSens Europe and Book your Full Pass now to access the Technical Tutorials!