Paper (Open access)

"Seeing" the Invisible: Under Vehicle Reconstruction (UVR) for Surround View Visualization


Published under licence by IOP Publishing Ltd

Citation: Feng Hu et al 2022 J. Phys.: Conf. Ser. 2330 012015. DOI: 10.1088/1742-6596/2330/1/012015

Abstract

Providing a blind-spot-free vehicle surround view to the driver is important for many driving maneuvers such as parking. Existing vehicle Surround View Systems (SVS) can only visualize the front, left, rear, and right sides of the vehicle, leaving the under-vehicle area unknown. However, perceiving the under-vehicle area is critical for many tasks such as passing over speed bumps, avoiding potholes, and driving on narrow roads with high curbs or on unpaved terrain. In this paper, we propose a novel Under Vehicle Reconstruction (UVR) algorithm which uses what the vehicle has seen in the past, together with vehicle egomotion, to "see" through the originally invisible under-vehicle area. First, the front or rear fisheye cameras are used to build a local textured map for future use. Second, the vehicle's precise location and orientation within the local map are estimated from the vehicle egomotion. Finally, the corresponding under-vehicle texture is retrieved from the map using the vehicle's pose and stitched together with the traditional Surround View System output to provide a new blind-spot-free visualization. To the best of our knowledge, our work is the first solution that provides full under-vehicle reconstruction, which enables many Advanced Driving Assistance System (ADAS) functionalities such as a transparent hood or transparent vehicle. Experiments on both simulated and real data are presented to show the effectiveness and robustness of the proposed algorithm.
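The three steps in the abstract (accumulate seen ground texture into a local map, localize the vehicle in that map, read back the patch now under the vehicle) can be sketched as follows. This is a minimal toy illustration of the idea, not the paper's implementation: the class name, the flat 2D grid-map representation, the fixed resolution, and the omission of vehicle orientation and fisheye unwarping are all our own simplifying assumptions.

```python
import numpy as np

class UnderVehicleReconstructor:
    """Toy sketch of the UVR idea: texture observed ahead of the vehicle is
    written into a local ground-plane map; later, when the vehicle has driven
    over that spot, the same cells are read back as the under-vehicle view.
    Assumes a flat ground plane and axis-aligned motion (no rotation)."""

    def __init__(self, map_size=200, resolution=0.05):
        self.res = resolution                           # metres per map cell
        self.map = np.zeros((map_size, map_size, 3), dtype=np.uint8)
        self.origin = map_size // 2                     # vehicle starts at map centre

    def world_to_cell(self, x, y):
        """Convert a world position (metres) to a (row, col) map cell."""
        return (int(round(y / self.res)) + self.origin,
                int(round(x / self.res)) + self.origin)

    def deposit(self, texture, x, y):
        """Write a ground-plane texture patch (H x W x 3), e.g. unwarped from
        the front fisheye camera, into the map at world position (x, y)."""
        r, c = self.world_to_cell(x, y)
        h, w = texture.shape[:2]
        self.map[r:r + h, c:c + w] = texture

    def retrieve(self, x, y, length_cells, width_cells):
        """Read back the patch currently under the vehicle, given the pose
        estimated from egomotion; this is what gets stitched into the SVS."""
        r, c = self.world_to_cell(x, y)
        return self.map[r:r + length_cells, c:c + width_cells].copy()
```

In use, `deposit` is called while the camera still sees the road ahead, and `retrieve` is called once egomotion says the vehicle footprint has moved onto that region; a real system would additionally handle rotation (warping by the full SE(2) pose) and blend overlapping observations.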


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
