
On-board Sensors and Instrumentation for Driver-Assisted/Autonomous Vehicles

SAPONARA, SERGIO
Primo
2017-01-01

Abstract

The tutorial will focus on recent advances in sensors and on-board instrumentation for new vehicle generations with driver-assistance capability. The economic and social impact of this application field is huge: every year about 90 million vehicles are sold worldwide, yet 1.25 million people are killed in road accidents, and in the US alone 3.1 billion gallons of fuel are wasted due to traffic congestion. Assisted driving, and in the near future autonomous driving, will increase safety and enable intelligent management of traffic flows. Key enabling technologies for this scenario are the on-board sensing systems and the relevant HW/SW acquisition and processing instrumentation for collision avoidance, cruise and brake control, parking assistance, enhanced driver vision, and tyre condition monitoring, to name just a few. The tutorial will be divided into four parts. In Part 1, “Introduction”, innovation and market trends in ICT applied to vehicles and intelligent transport systems will be discussed, focusing in particular on the next generation of driver-assisted/autonomous vehicles. In Part 2, “Advanced Sensors for Detection and Ranging”, real-time acquisition and processing of Radar and Lidar sensor data will be discussed, together with a comparison of the two technologies. These sensors aim at detecting obstacles around the vehicle and at measuring their distance, relative speed, and direction. Practical examples of multi-channel vehicular Radar systems developed by the instructor will be discussed. Part 3, “Vision Sensors for Smart Vehicles and ITS”, will focus on vision sensors organized as arrays of video cameras operating in the visible or near-infrared spectrum; the problem of reducing the distortions caused by large field-of-view fisheye lenses will also be discussed.
Applications to traffic sign and road sign recognition, and to image mosaicking for an all-around view during parking assistance, will be discussed. Finally, Part 4, “Sensor Fusion Towards the Autonomous Car”, will present examples of driver assistance and autonomous navigation using sensor fusion, i.e. integrating information from Radar, Lidar, and video cameras. An analysis of errors in real-time obstacle tracking will be presented, and functional safety issues will also be discussed.
2017
9781509035960

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11568/838830