Volume: 30 | Article ID: art00009
Multi-sensor fusion for Automated Driving: Selecting model and optimizing on Embedded platform
DOI: 10.2352/ISSN.2470-1173.2018.17.AVM-256 | Published Online: January 2018
Abstract

Automated driving requires fusing information from a multitude of sensors, such as cameras, radars, and lidars mounted around the car, to handle various driving scenarios, e.g. highway, parking, urban driving, and traffic jams. Fusion also improves functional safety by handling challenging conditions such as adverse weather, time of day, and occlusion. The paper gives an overview of the popular fusion techniques, namely the Kalman filter and its variations, e.g. the Extended Kalman filter and the Unscented Kalman filter. The paper proposes a choice of fusion technique and its model parameters for a given sensor configuration. The second part of the paper focuses on an efficient solution for series production on an embedded platform using Texas Instruments' TDAx Automotive SoC. The performance is benchmarked separately for the "predict" and "update" phases for different sensor modalities. For typical L3/L4 automated driving consisting of multiple cameras, radars, and lidars, fusion can be supported in real time by a single DSP using the proposed techniques, enabling a cost-optimized solution.
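
The benchmarked "predict" and "update" phases correspond to the standard Kalman filter recursion. As a point of reference only (not the paper's TDAx implementation), the sketch below shows a minimal linear Kalman filter with a constant-velocity state and a position-only measurement; the variable names, noise model, and values are illustrative assumptions.

```c
/* Minimal linear Kalman filter sketch: 2-state (position, velocity) model,
 * position-only measurement. Illustrative only; not the paper's TDAx code. */
#include <stdio.h>

typedef struct {
    double x[2];      /* state estimate: [position, velocity] */
    double P[2][2];   /* state covariance */
    double q;         /* process noise intensity (assumed) */
    double r;         /* measurement noise variance (assumed) */
} KalmanFilter;

/* Predict phase: x = F x, P = F P F' + Q, with F = [[1, dt], [0, 1]] */
static void kf_predict(KalmanFilter *kf, double dt)
{
    /* State propagation under the constant-velocity model */
    kf->x[0] += dt * kf->x[1];

    /* Covariance propagation P = F P F' */
    double p00 = kf->P[0][0] + dt * (kf->P[1][0] + kf->P[0][1]) + dt * dt * kf->P[1][1];
    double p01 = kf->P[0][1] + dt * kf->P[1][1];
    double p10 = kf->P[1][0] + dt * kf->P[1][1];
    double p11 = kf->P[1][1];

    kf->P[0][0] = p00 + kf->q * dt;   /* simplified diagonal Q (assumption) */
    kf->P[0][1] = p01;
    kf->P[1][0] = p10;
    kf->P[1][1] = p11 + kf->q * dt;
}

/* Update phase with measurement z of position only, H = [1, 0] */
static void kf_update(KalmanFilter *kf, double z)
{
    double y = z - kf->x[0];               /* innovation */
    double s = kf->P[0][0] + kf->r;        /* innovation covariance S = H P H' + R */
    double k0 = kf->P[0][0] / s;           /* Kalman gain K = P H' / S */
    double k1 = kf->P[1][0] / s;

    /* State correction x = x + K y */
    kf->x[0] += k0 * y;
    kf->x[1] += k1 * y;

    /* Covariance correction P = (I - K H) P */
    double p00 = (1.0 - k0) * kf->P[0][0];
    double p01 = (1.0 - k0) * kf->P[0][1];
    double p10 = kf->P[1][0] - k1 * kf->P[0][0];
    double p11 = kf->P[1][1] - k1 * kf->P[0][1];

    kf->P[0][0] = p00; kf->P[0][1] = p01;
    kf->P[1][0] = p10; kf->P[1][1] = p11;
}

int main(void)
{
    KalmanFilter kf = { .x = {0.0, 0.0},
                        .P = {{1.0, 0.0}, {0.0, 1.0}},
                        .q = 0.1, .r = 0.5 };
    const double meas[] = {0.9, 2.1, 2.9, 4.2};  /* synthetic position measurements */

    for (int i = 0; i < 4; ++i) {
        kf_predict(&kf, 1.0);        /* "predict" phase */
        kf_update(&kf, meas[i]);     /* "update" phase */
        printf("step %d: pos=%.2f vel=%.2f\n", i, kf.x[0], kf.x[1]);
    }
    return 0;
}
```

The Extended and Unscented Kalman filters discussed in the paper replace the linear propagation above with a nonlinear motion/measurement model (linearized via Jacobians, or approximated via sigma points, respectively), but keep the same predict/update structure that the benchmarks measure.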

Cite this article

Shyam Jagannathan, Mihir Mody, Jason Jones, Pramod Swami, Deepak Poddar, "Multi-sensor fusion for Automated Driving: Selecting model and optimizing on Embedded platform," in Proc. IS&T Int'l. Symp. on Electronic Imaging: Autonomous Vehicles and Machines, 2018, pp. 256-1 - 256-5, https://doi.org/10.2352/ISSN.2470-1173.2018.17.AVM-256

Copyright statement
Copyright © Society for Imaging Science and Technology 2018
Electronic Imaging | ISSN 2470-1173 | Society for Imaging Science and Technology