Tutorial sessions are offered to AutoSensLEARN attendees. Enhance your AutoSens learning experience by booking an AutoSensLEARN ticket and attending any or all of our four expert-led tutorials. If you have a particular interest in just one of these tutorials, you can also buy a Tutorial Only pass.
These expert-led tutorials are the perfect in-depth, technical accompaniment to the main conference agenda, covering a range of topics and themes. To attend the tutorials, in addition to the main conference sessions, please book your AutoSensLEARN bundle using the buttons below.
Tutorial 1: The Three Goals of HDR
Date: Wednesday 23 September, 2020
Time: 1pm – 4pm BST
Led by: Alessandro Rizzi, Full Professor and Head of MIPSLab, Department of Computer Science, University of Milan
High Dynamic Range (HDR) imaging is a continuously evolving area of imaging. HDR became popular more than twenty years ago with the seminal paper of Debevec and Malik, which proposed combining multiple exposures to capture a wider range of scene information.
More than ten years ago, interest shifted to recreating HDR scenes by combining widely used LCD panels with LED backlight illumination (Helge Seetzen's BrightSide displays). Today, the evolution continues with current sales of HDR televisions using OLED and Quantum Dot technologies, while standards for HDR video media formats remain an active area of research.
This tutorial reviews the science and technology underlying the evolution of HDR imaging, from silver-halide photography to HDR TVs. HDR imaging is a complex problem governed by optical, signal-processing and visual limits, and the solution depends on its goal.
After a detailed description of the dynamic range problem in image acquisition, this course covers standard methods of creating and manipulating HDR images, organised around the different possible goals of the HDR pipeline: reproducing the light field, reproducing appearance, and improving image aesthetics and visibility. For each goal, a careful analysis of characteristics, limits and ground truth will be presented. The course aims to replace myths with measurements about the limits of accurate camera acquisition (range and colour) and the usable range of light for displays presented to human vision. It discusses the principles of tone rendering and the role of HDR spatial comparisons.
- HDR Reproduction History
- HDR principles, devices and techniques
- The 3 HDR goals
- Reproducing original HDR scene: Capture Challenges
- Rendering Appearance for LDR display: Display Challenges
- Improving image aesthetic and visibility: HDR in Human Vision
- Goals, ground-truths and assessment criteria for HDR applications
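The multiple-exposure capture challenge listed above can be illustrated with a minimal sketch. This is not the tutorial's material: it assumes a linear camera response (the full Debevec-Malik method also recovers the camera's response curve) and uses a simple hat-shaped weighting to merge per-frame radiance estimates; `merge_exposures` and its parameters are illustrative names, not a published API.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge differently exposed frames into one HDR radiance map.

    Simplified Debevec-Malik-style merge: assumes a linear camera
    response, weights mid-range pixels most heavily, and averages
    the per-frame radiance estimates (pixel_value / exposure_time).
    """
    images = [np.asarray(im, dtype=np.float64) for im in images]

    def weight(z):
        # Hat weighting: trust mid-tones, distrust near-black
        # and near-white (clipped) pixels; z is normalised to [0, 1].
        return 1.0 - np.abs(2.0 * z - 1.0)

    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for im, t in zip(images, exposure_times):
        w = weight(im) + 1e-8   # small floor avoids division by zero
        num += w * (im / t)     # radiance estimate from this frame
        den += w
    return num / den            # weighted-average radiance map

# Example: the same three-pixel scene captured at two exposure times.
true_radiance = np.array([0.2, 1.0, 4.0])
short = np.clip(true_radiance * 0.1, 0, 1)  # t = 0.1 s, nothing clips
long_ = np.clip(true_radiance * 0.5, 0, 1)  # t = 0.5 s, brightest pixel clips
hdr = merge_exposures([short, long_], [0.1, 0.5])
```

The clipped pixel in the long exposure gets near-zero weight, so its radiance is recovered almost entirely from the short exposure, while the dark pixels benefit from the better-exposed long frame.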
Tutorial 2: The latest advances in image sensor technology
Date: Wednesday 30 September, 2020
Time: 1pm – 4pm BST
Led by: Prof. Albert Theuwissen, Founder, Harvest Imaging, Belgium
- Numbers (add up to nothing)
- High dynamic range
- Voltage-domain global shutters
- Low noise
- Colour filter news
- Phase-detection auto-focus pixels
- The extremes
- "New" materials
- Beyond silicon in the near-IR
- Event-based imagers
- PTC in the dark
Tutorial 3: FMCW LiDAR
Date: Tuesday 6 October, 2020
Time: 2-4pm BST
Led by: Tobias J. Kippenberg, Full Professor in the Institute of Physics and Electrical Engineering, EPFL
Tobias J. Kippenberg has been Full Professor in the Institute of Physics and Electrical Engineering at EPFL in Switzerland since 2013, having joined EPFL in 2008 as a Tenure Track Assistant Professor. Prior to EPFL, he was an Independent Max Planck Junior Research Group Leader at the Max Planck Institute of Quantum Optics in Garching. While at the MPQ he demonstrated radiation-pressure cooling of optical micro-resonators and developed techniques with which mechanical oscillators can be cooled, measured and manipulated in the quantum regime, work that is now part of the research field of cavity quantum optomechanics. Moreover, his group discovered the generation of optical frequency combs using high-Q micro-resonators, a principle now known as micro-combs or Kerr combs.
For his early contributions to these two research fields, he has received the EFTF Award for Young Scientists (2011), the Helmholtz Prize in Metrology (2009), the EPS Fresnel Prize (2009), the ICO Award (2014), the Swiss Latsis Prize (2015), the Klung Wilhelmy Science Award in Physics (2015) and the 2018 ZEISS Research Award. He was also a first-prize recipient of the 8th European Union Contest for Young Scientists in 1996 and is listed among the 1% most cited physicists in the Highly Cited Researchers List for 2014-2018. He is the founder of the startup LIGENTEC SA, an integrated photonics foundry.
Tutorial 4: Autonomous Driving with ROS
Date: Wednesday 7 October, 2020
Time: 1pm – 3pm BST
Led by: Jeremy Lebon, Lecturer / Researcher, VIVES University of Applied Sciences
Although ROS was primarily developed for classical robotics, much of its code and functionality has direct analogies with autonomous driving. In this tutorial, these analogies will be explored and clarified through a practical use case.