Outsight launch at AutoSens

Since 2015, Dibotics has been a pioneer in perception for Smart Machines, working heavily with Self-Driving Cars. The company brought a contrarian approach to real-time 3D data processing: without Machine Learning or training datasets, using very low power, yet delivering enriched and precise information.

Earlier this year Dibotics entered an exciting new phase and launched a new company called Outsight. A while back, the team behind Dibotics realised that their software approach, running on a tiny chip, could be leveraged far more efficiently on custom hardware (LiDAR at the time).

When they met their new associate, Scott Buchter, the inventor of a revolutionary broadband laser imager, they saw the opportunity to create a new type of technology. They decided not to build the smart LiDAR they had initially envisioned, but a totally new device delivering a new kind of 3D data with capabilities beyond what LiDAR and Cameras can do.

Outsight is the result of combining the software capabilities of Dibotics with this new sensing technology, creating what they call a 3D Semantic Camera.


Joining the line-up in Brussels this September, Outsight have a premium spot on the Exhibition floor and feature in our inaugural Technology Showcase on Tuesday 17 September. Their President Founder, Raul Bravo, will also be delivering a presentation on “Snow-aware ADAS with Active Hyperspectral Sensing” during day one of the main conference at AutoSens Brussels 2019.

We wanted to find out more about the transition from Dibotics to Outsight, as well as delving into the presentation topic Raul will be covering in Brussels. Raul generously provided us with some of his insights on the matter.


Raul Bravo, President Founder, Outsight

Dibotics pushed the boundaries of 3D perception with its unique data processing

While we agree that Machine Learning is a wonderful tool for certain applications, the objective of Dibotics has been to push the limits of what can be done without Machine Learning or the need for datasets. We showed that it was possible, for instance, not only to perform SLAM-on-Chip®, Object Detection and Tracking, but also Classification of each individual 3D point. Performing point-wise classification without Machine Learning, on a tiny chip in real-time, is almost a heretical statement today, but we showed that it was totally possible, through several projects with emblematic OEMs and Tier 1s. This has opened new possibilities that are better leveraged when blended with an outstanding sensing solution on the same device.
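Raul does not describe Dibotics’ actual algorithms here, but the general idea of training-free, point-wise labelling can be illustrated with a purely hypothetical sketch: assigning a label to each point of a 3D cloud from its geometry alone, with no learned model or dataset. The function name, thresholds and labels below are assumptions made for illustration, not Dibotics’ method.

```python
# Illustrative only: a rule-based (training-free) point-wise classifier.
# This is NOT Dibotics' algorithm; it simply shows that per-point labels
# can be derived from simple geometry, with no dataset or learned model.
import numpy as np

def classify_points(points: np.ndarray, ground_z: float = 0.0,
                    ground_tol: float = 0.2, height_split: float = 2.5):
    """Label each 3D point as 'ground', 'obstacle' or 'overhead'.

    points       : (N, 3) array of x, y, z coordinates in metres.
    ground_z     : assumed road-plane height (a real system would estimate it).
    ground_tol   : vertical tolerance around the road plane.
    height_split : height above which points are treated as overhead structure.
    """
    z = points[:, 2]
    labels = np.full(len(points), "obstacle", dtype=object)
    labels[np.abs(z - ground_z) <= ground_tol] = "ground"     # near road plane
    labels[z - ground_z > height_split] = "overhead"          # gantries, signs
    return labels

# Example: synthetic points at road level, bumper height and a gantry.
cloud = np.array([[5.0, 0.0, 0.05], [8.0, 1.0, 0.8], [12.0, -2.0, 4.5]])
print(classify_points(cloud))   # ['ground' 'obstacle' 'overhead']
```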

Quantum jump in perception and simultaneous understanding for “Smart Machines”, including autonomous vehicles

We think that incremental innovation in sensors like LiDAR or Cameras won’t suffice to deliver the kind of Situation Awareness that Smart Machines require.

Well before fully autonomous cars can be developed, L1 and L2 ADAS must evolve towards better perception and understanding of the environment in order to solve challenging situations. They must also provide a safer and smarter driving experience and, in some cases, a redundant complement to current ADAS solutions for critical applications.

We aim to provide a quantum jump along three axes:

  1. Providing a 3D perception of the environment that surpasses what current LiDAR, Radar and Cameras can do.
  2. Simultaneously processing this data in the same device, to deliver out-of-the-box actionable information, without relying on Machine Learning nor power-hungry processing units.
  3. Unveiling new information, invisible to the human eye, that can directly be leveraged to provide Full Situation Awareness. We’ll show this capability in our booth during AutoSens.

Standing out as Most Exciting Start-Up finalists at the AutoSens Awards

We are thrilled to have been shortlisted for the AutoSens Awards 2019. We believe our submission stands out as we’re not following the typical Start-Up path in several ways.

  • As Dibotics, we certainly made bold statements followed by real field proof, with significant sales and profits from key customers since day one, without a single Euro of outside investment. This shows that it’s possible to combine disruptive approaches with short-term valuable deliverables.
  • Our new move is bold as well, as we are combining two different technologies, with scientists and engineers who normally don’t work together: re-inventing lasers and high-level processing software in the same team is not common. Too often, software is an afterthought for hardware people, or vice versa.
  • Making totally unique software and totally unique hardware, and creating unique value that is only possible with the combination of both, has allowed us to attract the best talent to the project, based in San Francisco, Paris and Helsinki. The team has already made major achievements in a matter of months, which we’ll show during AutoSens.
  • Finally, the founding team has significant experience in building deep-tech companies, having founded several start-ups in robotics, AI, Lasers and electronics. We have built multi-national teams of hundreds of people, delivered millions of products to happy customers around the globe and, as a result, distributed half a Billion Euros to investors.

Top team of founders at Outsight

Raul Bravo and Olivier Garcia, beyond founding Dibotics, have previously built other companies in mobile robotics, including the only solution for fully automated forklifts that has been working 24 hours a day for 13 years in challenging production environments such as warehouses and automotive factories, relying only on 2D LiDAR sensors.

Cédric Hutchings, former VP of Nokia Technologies and ex-CEO/founder of Withings, has extensive experience in leading and scaling international teams, industrializing complex products that combine innovative sensors with embedded software, oftentimes in highly-regulated industries.

Scott Buchter is probably one of the world’s leading experts in laser and hyperspectral technologies. He has put this expertise into practice by creating several companies that leverage his unique knowledge and experience.

The founding team combines the experience of software, robotics, lasers and scaling organizations/products.

We have three objectives in the first year:

  1. To demonstrate that our technology actually delivers on its promises, with concrete deliverables to our first customers, who are helping us from day one to ensure that we’re building the right solution.
  2. To simultaneously build mass-production capabilities. We think that the market requires a product that is both performant and scalable.
  3. As this is only the beginning, our objective is to build a top-talent world-class organization, with the ability to tackle the future challenges and innovations that we are envisioning.

Challenges in using Hyperspectral imaging to enhance car safety

Raul Bravo will be presenting a session at AutoSens in September, introducing Snow-aware ADAS with Active Hyperspectral Sensing. His aim is to provide Full Situation Awareness, and the presentation will use a simple example to illustrate in a concrete way the value this can create for ADAS applications. The assessment of road conditions and hazards in real-time is not a new subject, but we’re showing how it can be concretely achieved with our remote, active detection of ice and snow based on hyperspectral imaging, in real-time.
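The presentation abstract does not disclose how Outsight’s detection works. As a hedged, purely illustrative sketch of the classical principle behind spectral snow/ice detection (strong visible reflectance versus strong short-wave-infrared absorption of frozen water), one could compute a normalized-difference index per pixel and threshold it. The band choice and the 0.4 threshold below are assumptions for the sketch, not Outsight’s parameters.

```python
# Illustrative only: a generic spectral-index approach to flagging snow/ice,
# not Outsight's method. Snow and ice reflect strongly in the visible range
# and absorb strongly in the SWIR range, so a normalized difference is large.
import numpy as np

def snow_ice_mask(vis_band: np.ndarray, swir_band: np.ndarray,
                  threshold: float = 0.4) -> np.ndarray:
    """Return a boolean mask of pixels that look like snow or ice.

    vis_band, swir_band : per-pixel reflectance images (same shape, 0..1).
    threshold           : index value above which a pixel is flagged (assumed).
    """
    eps = 1e-6                                            # avoid division by zero
    index = (vis_band - swir_band) / (vis_band + swir_band + eps)
    return index > threshold

# Example: a dry-asphalt pixel vs. a snowy pixel (hypothetical reflectances).
vis = np.array([[0.15, 0.85]])
swir = np.array([[0.12, 0.10]])
print(snow_ice_mask(vis, swir))   # [[False  True]]
```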

Even if this is just a simple example of one of our basic capabilities, the positive impact of such a solution can be significant: in the US alone, over 1,300 people are killed and more than 116,800 people are injured in vehicle crashes on snowy, slushy or icy pavement annually (U.S. Department of Transportation), and the associated costs exceed 25 Billion USD, based on NHTSA statistics.

Introducing snow and ice detection capabilities in current ADAS solutions is a clear short-term opportunity to achieve a significant reduction in accidents.

Outsight launch at AutoSens in September 2019

Launching in September 2019, we plan to use AutoSens to show in a concrete way how our Full Situation Awareness approach can deliver data and insight that Cameras and LiDARs can’t provide, in several scenarios. The action will not only take place in our booth: we also have a demonstration showing these capabilities at a distance, otherwise it would not be “remote sensing”.

You’ll be able to catch Outsight on the Exhibition Floor, in the Technology Showcase and Raul on stage during the conference on day one in Brussels.

Book your tickets to join Outsight and Raul in Brussels >>
