Addressing the image sensor requirement for autonomous driving challenge with Abhay Rai, Sony Electronics

Abhay Rai, Director Product Marketing: Automotive Imaging at Sony Electronics.

Next in our series of interviews with the AutoSens Community, we caught up with Abhay Rai, Director Product Marketing: Automotive Imaging at Sony Electronics.

Abhay is presenting a session at AutoSens in Brussels on “Addressing the image sensor requirement for autonomous driving challenge”.

In what ways have engineers improved on nature’s design (i.e. the human eye)?

In the last decade, the speed and scale of innovation have had a tremendous impact on all areas of technology. In CMOS image sensing, this rapid innovation means that today’s sensors can surpass the capability of the human eye.

This is achieved not only through clever design but, to a large extent, through total control of manufacturing. Dedicated CMOS fabrication provides a tight link between the design and manufacturing environments, resulting in remarkable technological breakthroughs.

Is anyone looking at animals with more complex optical sensors, like an Octopus or Mantis Shrimp?

Nature has an extraordinary ability to adapt to various situations. Engineers always look to it as a golden reference.

The engineering balance lies in looking at what nature offers and providing a solution that meets that performance at a particular cost point and form factor, and that supports commercial adoption.

Apart from field of view and weather conditions, what are sensor problems for autonomous driving?

A few years back, automotive image sensors focused on non-safety-critical applications (e.g. rear view). The sensors only had to work for a very short time (less than two minutes) on each trip.

With the emergence of safety-critical applications, the basic performance requirements have changed. Sensors need to work reliably for long periods. Operating temperature range, functional safety, security, power, and heat are all important performance parameters that a safety-critical system design needs to address.

The challenge for an image sensor for autonomous driving is this: can the sensor still see well under all real-world conditions (working reliably, and always working)?

Why can’t we just use dashcam footage on YouTube for finding problems?

Dashcam footage can be one data point, but an autonomous driving system requires far more real-world data and real-world simulation to arrive at the most robust and safest autonomous car.

Who should come to your session?

Anyone who feels morally obligated to apply the best technology and mindset to building the safest car by choosing the best ingredients. Saving lives comes with great and serious responsibility. When a 5,000-pound car crashes, it carries momentum and puts many lives at stake. System designers need to make a conscious choice to use the best technology available.

Ten years ago I had 20/20 vision; now I’m coming up to 40 and no longer do, but I still have the same car. Shouldn’t we worry that safety-critical technology might degrade just as we do?

Absolutely we should. This is where trustworthiness is a critical factor. Functional safety should be a priority, and the sensor should provide the best low-light sensitivity at the lowest dark current to enable object recognition in the most difficult circumstances.

You’re a senior member of IEEE – tell us about your involvement?

My involvement is to educate the engineering community. Specsmanship is dangerous and works against the goal of putting the safest car on the road. My responsibility is to help the engineering community come up with a proper, standardized way of defining the critical specifications for vision subsystems.

What’s your dream car, and why?  (and the embarrassing one: what do you really drive?)

My dream car is the safest car, one in which no lives are lost due to a safety issue. It earns my full trust and provides the utmost comfort: a car I can trust to put my family in.

Today I drive a battery-powered car, which is a work in progress.

You’ve filed several patents – any highlights you are particularly proud of?

I have filed several patents and trade secrets in the area of human interfaces. Human-machine interaction is very important: the interface has to be seamless and natural. This is true even in autonomous driving, where the interface has to be seamless for the technology to work at the different levels of autonomy.

What are you most looking forward to at AutoSens?

Besides meeting key people in the industry, identifying companies that value system and component performance in their quest to build safety-critical systems, and teaming up with partners who treat safety and reliability as a priority.
