A Multi-Sensor Simulation Environment for Autonomous Cars



Song, Rui ORCID: 0000-0002-8695-1522, Horridge, Paul, Pemberton, Simon, Wetherall, Jon, Maskell, Simon ORCID: 0000-0003-1917-2913 and Ralph, Jason ORCID: 0000-0002-4946-9948
(2019) A Multi-Sensor Simulation Environment for Autonomous Cars. In: 2019 22nd International Conference on Information Fusion (FUSION), 2-5 July 2019, Ottawa, Canada.

A Multi-Sensor Simulation Environment for Autonomous Cars.pdf - Author Accepted Manuscript


Abstract

This paper describes a multi-sensor simulation environment that is being used to develop tracking methods to improve the accuracy of environmental perception and obstacle detection for autonomous vehicles. The system is being developed as part of a collaborative project entitled Artificial Learning Environment for Autonomous Driving (ALEAD). It currently incorporates a range of sensor models, such as camera, infrared (IR) camera and LiDAR, with radar and GNSS-aided navigation systems to be added at a later stage. Each sensor model has been developed to be as realistic as possible, incorporating physical defects and other artefacts found in real sensors. This paper describes the environment and sensors, and demonstrates the use of a Kalman-filter-based tracking algorithm to fuse data and predict the trajectories of dynamic obstacles. The multi-sensor tracking system has been tested by tracking a ball bouncing in a 3D environment constructed using Unity3D software.
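To illustrate the kind of Kalman-filter-based tracking the abstract refers to, the following is a minimal sketch, not the authors' implementation: a linear Kalman filter estimating the vertical position of a bouncing ball from noisy height measurements, with gravity treated as a known control input. All matrices, noise values, and the simulation loop are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's implementation): a linear Kalman filter
# tracking the vertical position of a bouncing ball from noisy measurements.
# State x = [height, velocity]; gravity enters as a known control input.
dt, g = 0.01, 9.81
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
B = np.array([0.5 * dt**2, dt])         # control matrix applied to gravity
H = np.array([[1.0, 0.0]])              # we observe height only
Q = np.diag([1e-4, 0.1])                # process noise (large on velocity,
                                        # so the filter recovers after bounces)
R = np.array([[0.05**2]])               # measurement noise covariance

def kf_step(x, P, z):
    """One predict/update cycle of the Kalman filter."""
    # Predict, with gravity as the control input u = -g
    x = F @ x + B * (-g)
    P = F @ P @ F.T + Q
    # Update with the noisy height measurement z
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a bouncing ball and track it.  The elastic bounce violates the
# linear motion model, so the error spikes briefly at each impact and the
# filter then re-converges.
rng = np.random.default_rng(0)
true = np.array([2.0, 0.0])              # drop from 2 m
x, P = np.array([1.5, 0.0]), np.eye(2)   # deliberately poor initial guess
errors = []
for _ in range(500):
    true = F @ true + B * (-g)
    if true[0] < 0.0:                    # elastic bounce at the ground
        true[0], true[1] = -true[0], -true[1]
    z = true[0] + rng.normal(0.0, 0.05)  # noisy height measurement
    x, P = kf_step(x, P, np.array([z]))
    errors.append(abs(x[0] - true[0]))

print(f"mean tracking error: {np.mean(errors[100:]):.3f} m")
```

In the paper's multi-sensor setting, the same predict/update structure would fuse measurements from several sensor models (camera, IR camera, LiDAR) by applying one update per sensor, each with its own H and R.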

Item Type: Conference or Workshop Item (Unspecified)
Uncontrolled Keywords: multi sensors, autonomous driving, visual tracking, virtual environment
Depositing User: Symplectic Admin
Date Deposited: 18 Mar 2020 11:53
Last Modified: 15 Mar 2024 03:20
DOI: 10.23919/fusion43075.2019.9011278
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3079343