We caught up with ON Semiconductor ahead of AutoSens in Brussels this September. They chatted with us about their portfolio expansion into automotive LiDAR; their new Modular Automotive Reference System; and the most exciting technology change they expect to see in the industry in 2018…
Great to have ON Semiconductor taking part at AutoSens again – what’s new?
We continue to expand the breadth of our sensing technologies. ON Semiconductor has recently acquired SensL Technologies, a technology leader specializing in Silicon Photomultiplier (SiPM), Single Photon Avalanche Diode (SPAD) and LiDAR sensing products for the automotive, medical, industrial and consumer markets. This follows on the heels of our radar investment 15 months ago; we will be sampling radar products later this year. The investment in SensL now expands our portfolio to include LiDAR for automotive.
We are seeing a shift in the market for Level 4 and Level 5 programs, with both established companies and new entrants transitioning into more formalized scale-up plans – moving from single vehicles to tens, from tens of vehicles to hundreds, and so on. We are delighted to be engaging with ecosystem partners and customers on these advances in autonomous driving.
One particular innovation we saw, the ‘Modular Automotive Reference System’, looks like a great tool for developing vehicle vision systems – what brought that about?
As we looked at how to make our technologies more accessible to our ecosystem partners and customers, we realized that designing a camera was sometimes a barrier to their being able to use, test and experiment with our sensors. As we broadened our ecosystem engagement, we added more partners who will never design a camera themselves. They are developing algorithms. They are building technologies around camera systems. What they really needed was access to a camera platform that would give them the ability to experiment with different combinations. So we came up with the Modular Automotive Reference System (MARS), which means we do not have to design a new board every time we have a new sensor or a new ISP, and we can rapidly enable its use with the other parts of a camera system.
We wanted modularity so we could iterate or make changes to the imaging board while leveraging all the work that had been done for a specific ISP, a partner’s ISP, or different IO technologies. We extended this with our Demo3 head board, so this array of camera options can be plugged into a desktop PC running our Devware suite of tools or into an embedded ECU from our ecosystem partners, enabling experimentation and development work to move from the laboratory benchtop out to road testing faster and with greater flexibility.
How are you involved with IEEE SA P2020?
ON Semiconductor is participating in the four currently active sub-committees:
Sub-committee 0: White paper and glossary
Sub-committee 1: LED Flicker
Sub-committee 2: Image quality for human viewing
Sub-committee 3: Image quality for machine vision
We have contributed content to the white paper and are actively working on the test procedures and KPIs in the other sub-committees.
Your main focus at our previous event, in Brussels, was Cybersecurity and Functional Safety – do you feel the automotive industry is learning fast enough in those areas?
We are seeing a surge in interest, depth of analysis and deployment plans for both Functional Safety and Cybersecurity features. We believe we are a bit ahead of the curve, since we chose to invest aggressively in these features in hardware. We feel they are critical for next-generation capabilities in ADAS and absolutely crucial for autonomous driving.
One of our panel discussions at AutoSens Detroit is “How many cameras is enough?” – assuming we’re talking about L5 autonomy, what do you think?
The industry is still in a stage of discovery and learning. We see a range of applications where some customers are using higher quantities of lower resolution cameras and others going for higher capability imagers in fewer quantities per vehicle. We do not believe there is a single right answer. So, we provide an array of options to let the OEMs and algorithm providers choose sensors that meet their target capabilities, system architecture and price points.
ON Semiconductor has been recognised as one of the world’s most ethical companies for the past three years running – how does that translate into day-to-day working life for employees?
We are flattered and pleased to be recognized three years in a row as one of the world’s most ethical companies. We have a very large business in automotive, an industry that is driven by long-term relationships, and in an industry like that it is simply good business to behave ethically. Companies that do not may have short-term success, but we are in it for the long term.
What’s the most exciting technology change in the industry you expect to see in 2018?
It is exciting to see more advanced systems that leverage multiple sensor modalities in increasingly integrated and sophisticated ways. This sensor fusion trend is exactly why we have invested in broadening our portfolio of sensor technologies, to ensure we can provide cutting-edge solutions to our customers.
What are you most looking forward to at AutoSens in 2018?
Feeling the pulse of what the industry is doing, having the chance for face-to-face conversations with our customers and partners, and hearing presentations and learning about new ideas from across the industry.