Driving the revolution towards electrified vehicles with ON Semiconductor
We chat to ON Semiconductor, industry powerhouse and lead sponsor of AutoSens 2020, to hear their response to the impact of COVID-19 on automotive and to find out about their latest technology advances. They discuss their LiDAR Partner Program, their latest analysis of HDR image sensors, developments in safety, and how we can work towards removing barriers for testing.
2020 has been a very different year for all of us. Would you like to share ON Semiconductor’s experience, and how do you see the future impact of COVID-19 on the automotive industry playing out?
The challenges we are experiencing in 2020 are unprecedented. While global economies have been challenged during these trying times, innovation has not stopped. As Albert Einstein once said: “In the middle of difficulty lies opportunity.” We have seen many opportunities in 2020. During the COVID-19 related lockdowns, air quality improved significantly in major cities and regions worldwide! This is pushing the revolution towards electrified vehicles: the wider population and governments now have proof that we can improve our air quality. We have the technology in place, so let’s use it!
During this same time, deliveries of critical merchandise to critical areas were challenged. Fully autonomous vehicles were deployed to deliver goods to people in need in critical areas, and to carry COVID test samples from hospitals to test laboratories, avoiding any potential contamination risk to drivers. So the drive towards fully electrified and autonomous vehicles will certainly accelerate.
What are the current development areas that you are focusing on?
ON Semiconductor has been investing in power, analog and sensing solutions for Electric Vehicles and ADAS / AV. We are the market leader for automotive image sensors and ultrasonic sensor interface circuits, and the #2 supplier of automotive power devices.
ON Semiconductor is the only automotive semiconductor supplier to master all four sensing modalities (ultrasonic, image, radar, lidar) needed to enable the ADAS / AV revolution. Our innovation is driving not only sensor performance (more resolution, better definition, improved behaviour in extreme conditions), but also the system solutions (ISO 26262 Functional Safety support, cybersecurity, PMICs / power supplies, connectivity) required to achieve the necessary system performance in the harshest environments.
We are also relentlessly innovating in the power area to enable the market shift from traditional 12V ICE vehicles to BEVs, with solutions in silicon, silicon carbide and GaN ranging from 40V to 1200V.
At the beginning of the year you launched the LiDAR Partners Program. Can you summarise the aims of this Program? How has this developed in the last 8 months? Are there any more partners to announce?
The LiDAR Partners Program launched at the beginning of 2020. It is an ecosystem consisting of module manufacturers, module customization partners, and component suppliers, with the goal of connecting our customers and complementary semiconductor suppliers in a formal way.
It offers opportunities for LiDAR module manufacturers to benefit from our expertise, reference designs, and component partner ecosystem when designing LiDAR solutions, especially as LiDAR designs are moving to higher performance, lower cost solutions based on our silicon photomultiplier detectors. The Partners Program also allows OEMs to connect to these module suppliers. If an OEM has specific design requirements, the ability to connect with a customization partner specialized in manufacturing based on reference designs makes turning custom specifications into a product easier for OEMs that do not want to design or manufacture their own LiDAR sensor. We have continued to develop this program with a focus on the readout and illumination circuits with component suppliers as well, with a plan to publish more reference designs to offer system solutions to the LiDAR ecosystem.
Our public LiDAR partners currently include: Blickfeld, RoboSense, and SOS Lab. Additional announcements are expected as these companies’ LiDARs start production in 2021, along with more ecosystem partners. More details on our LiDAR Partners Program and ON Semiconductor’s wider ecosystem partners can be found here.
On the opening day of AutoSens Brussels, Sergey Velichko presented a comparative analysis of modern high dynamic range (HDR) automotive image sensors, could you summarise the outcome of this analysis? Which sensors were found to provide the best foundation for object detection?
Many high dynamic range (HDR) automotive image sensors exist on the market and most of them have good performance. However, good is not enough in automotive ADAS systems. How can we rely on a ‘good enough’ sensor when lives are at stake? We need to save lives and cannot settle for good performance. The difference in performance between the best sensor and a good sensor can mean that a vulnerable road user (VRU) is or is not detected in all conditions (a bright sunny day or at night) with sufficient margin to allow the system to react and take action. Detecting the VRU 50 – 100m earlier can be the difference between life and death.
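To put the 50 – 100m figure in context, a rough stopping-distance calculation shows why it matters. The reaction time and deceleration values below are common illustrative assumptions, not figures from ON Semiconductor:

```python
def stopping_distance(speed_kmh, reaction_s=1.5, decel_mps2=6.0):
    """Total distance to stop: distance covered during the
    reaction time plus the braking distance (v^2 / 2a).
    reaction_s and decel_mps2 are illustrative assumptions."""
    v = speed_kmh / 3.6                     # convert km/h to m/s
    reaction = v * reaction_s               # distance before braking begins
    braking = v * v / (2.0 * decel_mps2)    # distance while decelerating
    return reaction + braking

# At 100 km/h the vehicle needs roughly 106 m to come to a stop,
# so 50-100 m of extra detection range covers most or all of it.
print(round(stopping_distance(100), 1))  # -> 106.0
```

Under these assumptions, a sensor that sees a pedestrian 100m sooner at highway speed buys back essentially the entire stopping distance.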
ON Semiconductor has many years of experience, with over 400M automotive image sensors on the road. We have pioneered pixel technology and patented many different pixel architectures. This experience and know-how helps us understand the limitations of the different pixel technologies. Split-pixel architectures show very good behaviour in terms of motion artefacts and LED Flicker Mitigation (LFM), while multi-exposure architectures are superior in terms of high dynamic range (HDR), low light, noise, sharpness and color fidelity. With the development of the Super-Exposure (SE) pixel architecture, ON Semiconductor’s newest generation of image sensors combines the best of split-pixel and multi-exposure architectures. Super-exposure sensors provide the best image quality, colors and details across the full automotive temperature range, exceeding the detection and classification capabilities of split-pixel architectures in certain lighting conditions – i.e. split-pixel sensors exhibit visible noise and spectral / angular shift of colors in mid-to-high lighting conditions.
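The multi-exposure idea mentioned above can be sketched in a few lines. This toy merge (not ON Semiconductor’s actual pipeline; pixel values, ratio and bit depth are hypothetical) keeps the long exposure where it has the best signal-to-noise and falls back to a scaled short exposure where the long one clips:

```python
def merge_exposures(long_px, short_px, ratio=16, full_scale=4095):
    """Toy multi-exposure HDR merge: where the long exposure
    saturates, substitute the short exposure scaled by the
    exposure ratio. Extends usable dynamic range by roughly
    20*log10(ratio) dB over a single exposure."""
    merged = []
    for lp, sp in zip(long_px, short_px):
        if lp < full_scale:           # long exposure not clipped: best SNR
            merged.append(lp)
        else:                         # clipped: recover from short exposure
            merged.append(sp * ratio)
    return merged

# The dark pixel is kept from the long exposure; the bright pixel
# is recovered from the short exposure instead of clipping at 4095.
print(merge_exposures([120, 4095], [8, 900]))  # -> [120, 14400]
```

Real sensors blend exposures far more carefully (handling motion between captures, which is exactly the artefact split-pixel designs avoid), but the sketch shows why multiple exposures extend dynamic range.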
Super-exposure image sensors deliver the best performance for stable object detection and high image quality, especially in corner cases, requiring less overall system training and lower system development costs for the OEM / Tier 1. SE-based sensors improve ADAS system safety metrics for a given speed (a VRU can be detected earlier) or enable additional use cases at higher speeds.
This enhancement in performance can be the difference between saving or not saving a life! How can we settle for ‘good enough’?
Safety metrics have been an important discussion topic recently. What role do you as a sensor company play in ensuring the ADAS or AV system is as safe as possible?
Sensors are playing a key role in the deployment of ADAS / AV systems. These are the eyes and ears of the vehicle and provide the data to help the vehicle understand the surrounding environment. The vehicle, though, needs to operate in all lighting conditions (day, night, entering / exiting a tunnel in bright sunlight, etc) and in all weather conditions. Sensor performance is improving with every generation, extending the ranges within which they operate. Today’s automotive image sensors offer higher resolution (>8MP), higher dynamic range (>140dB) and improved image quality, allowing VRUs to be detected earlier and with greater margin. In addition, this improved performance is opening the road for higher levels of safety and autonomous driving functionality by enabling better detection of lanes and dangerous objects on the road, recognizing speed limit information from traffic signs, distinguishing between red, yellow and green signals from traffic lights, etc. Radar and lidar sensors are experiencing similar performance enhancements. The availability of these diverse and complementary sensing modalities is enabling higher levels of autonomous functionality in all weather and driving conditions. ON Semiconductor is working with the key players of the AV ecosystem (OEMs, Tier 1s, algorithm suppliers, computing suppliers) to understand the breadth and depth of the domain and its use cases, so we may design and deliver sensors enabling the transition from Level 2-3 to higher levels of autonomous driving.
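The >140dB figure quoted above is a logarithmic measure of scene contrast. A minimal sketch of the standard conversion (the example luminance ratio is illustrative, not a sensor specification):

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Dynamic range in decibels: 20 * log10(max / min)."""
    return 20.0 * math.log10(max_signal / min_signal)

# 140 dB corresponds to a 10,000,000:1 ratio between the brightest
# and darkest resolvable detail - e.g. direct sunlight and deep
# shadow captured in the same frame, as at a tunnel exit.
print(round(dynamic_range_db(10_000_000, 1)))  # -> 140
```

Each extra 20dB of dynamic range is another factor of ten in the brightness ratio the sensor can capture without clipping highlights or crushing shadows.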
How can we quantify safety for the consumer, and translate the amazing technical work going on at sensor and system level into increased consumer demand for higher safety levels?
According to the WHO, 1.4M deaths globally are due to road traffic accidents (www.who.int/violence_injury_prevention/road_safety_status/2018/en). The Boston Consulting Group estimated that 28% of crashes in the USA can be prevented with ADAS systems (https://www.bcg.com/press/29september2015-roadmap-to-safer-driving-through-driver-assistance-systems). According to NHTSA, 94% of serious crashes are caused by driver behaviour (https://www.nhtsa.gov/press-releases/usdot-releases-2016-fatal-traffic-crash-data). These statistics alone are clear proof that we need to continue developing technologies that enable vehicles to help proactively prevent accidents. Mandates and laws from governments around the globe are one of the main driving forces behind this exciting development. As these technologies are deployed and recognized (via safety ratings, promotion, insurance premium incentives, etc), consumer acceptance will rapidly increase around the globe. As consumer acceptance and trust increase, today’s advanced safety systems will be tomorrow’s standard equipment and will help accelerate the transition to higher autonomous driving levels. Beyond safety, the value of autonomous driving for consumers clearly lies in saving precious resources and time, as well as bringing higher levels of transportation comfort. It is a truly revolutionary change for society. Transformation at this scale will take years, but building higher-safety ADAS and better sensors today is enabling the car industry to pave the way toward the ultimate goal of fully autonomous driving.
In a recent blog post on onsemi.com it was stated that “the government will need to play its part in reducing barriers so that innovative companies can put more trials on the street, and move to larger-scale deployment.” What role do you see the industry and ON Semiconductor playing here?
The level of testing required for L4 / L5 is several orders of magnitude more complex than for L0 / L1. While simulation and laboratory testing are definitely useful, significant testing needs to be performed in real-world conditions. Models need to be trained, corner situations discovered, data gathered and miles driven. The best test environment is the real world. It is up to the industry to prove to governments that these tests are safe, but we need governments to allow these trials so we can move to large-scale deployment.
What can we expect from ON Semiconductor in the second half of AutoSens Brussels in October?
As the leader in automotive sensing technologies for ADAS / AV, we will continue to bring to market new solutions for both inside and outside the vehicle. These solutions will range from new viewing / sensing products to the release of automotive qualified SiPM arrays. Stay tuned!
Joseph Notaro is an accomplished technology industry executive with more than 25 years of experience in the semiconductor market. He has successfully led R&D, business / market development and worldwide sales organizations. Joseph has extensive international experience across the Americas, Asia Pacific and Europe.
Currently, Joseph is the Vice President of WW Automotive Strategy and Business Development at ON Semiconductor. In this role, he leads the automotive strategic marketing group responsible for defining the company strategy addressing the automotive end-market, working closely with the Executive team and Business Units. He has participated in several acquisitions, supports the company’s annual strategic planning process and leads the Automotive OEM Business Development teams.