University of Surrey


Towards Camera-LIDAR Fusion-Based Terrain Modelling for Planetary Surfaces: Review and Analysis

Shaukat, A, Blacker, PC, Spiteri, C and Gao, Y (2016) Towards Camera-LIDAR Fusion-Based Terrain Modelling for Planetary Surfaces: Review and Analysis. Sensors, 16 (11).

Text: sensors-16-01952-v2.pdf - Version of Record (23MB). Available under Licence: see the attached licence file.

Abstract

In recent decades, terrain modelling and reconstruction techniques have attracted increasing research interest for precise short- and long-distance autonomous navigation, localisation and mapping within field robotics. One of the most challenging applications is autonomous planetary exploration using mobile robots. Rovers deployed to explore extraterrestrial surfaces are required to perceive and model the environment with little or no intervention from the ground station. To date, stereopsis represents the state-of-the-art method and can achieve short-distance planetary surface modelling. However, future space missions will require scene reconstruction at greater distance, fidelity and feature complexity, potentially using other sensors such as Light Detection And Ranging (LIDAR). LIDAR has been extensively exploited for target detection, identification and depth estimation in terrestrial robotics, but is still under development as a viable technology for space robotics. This paper first reviews current methods for scene reconstruction and terrain modelling using cameras in planetary robotics and LIDARs in terrestrial robotics; it then proposes camera-LIDAR fusion as a feasible technique to overcome the limitations of either of these individual sensors for planetary exploration. A comprehensive analysis is presented to demonstrate the advantages of camera-LIDAR fusion in terms of range, fidelity, accuracy and computation.

Item Type: Article
Subjects : Electronic Engineering
Divisions : Faculty of Engineering and Physical Sciences > Electronic Engineering
Authors :
Shaukat, A (Email: unspecified; ORCID: unspecified)
Blacker, PC (Email: unspecified; ORCID: unspecified)
Spiteri, C (Email: unspecified; ORCID: unspecified)
Gao, Y (Email: unspecified; ORCID: unspecified)
Date : 20 November 2016
Identification Number : https://doi.org/10.3390/s16111952
Copyright Disclaimer : Copyright 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).
Uncontrolled Keywords : 3-D reconstruction; terrain modelling; LIDAR-camera fusion; planetary surface perception; hybrid vision systems
Additional Information : (This article belongs to the Special Issue Vision-Based Sensors in Field Robotics)
Depositing User : Symplectic Elements
Date Deposited : 22 Nov 2016 16:06
Last Modified : 22 Nov 2016 16:06
URI: http://epubs.surrey.ac.uk/id/eprint/812938


© The University of Surrey, Guildford, Surrey, GU2 7XH, United Kingdom.
+44 (0)1483 300800