Zhou, Yi; Liu, Lulu; Zhao, Haocheng; Lopez-Benitez, Miguel (ORCID: 0000-0003-0526-6687); Yu, Limin; Yue, Yutao (ORCID: 0000-0003-4532-0924) (2022) Towards Deep Radar Perception for Autonomous Driving: Datasets, Methods, and Challenges. Sensors, 22(11), 4208.
Abstract
With recent developments, the performance of automotive radar has improved significantly. The next generation of 4D radar achieves imaging capability in the form of high-resolution point clouds. In this context, we believe that the era of deep learning for radar perception has arrived. However, studies on radar deep learning are spread across different tasks, and a holistic overview is lacking. This review paper provides a big picture of the deep radar perception stack, including signal processing, datasets, labelling, data augmentation, and downstream tasks such as depth and velocity estimation, object detection, and sensor fusion. For these tasks, we focus on explaining how the network structure is adapted to radar domain knowledge. In particular, we summarise three overlooked challenges in deep radar perception, namely multi-path effects, uncertainty problems, and adverse weather effects, and present existing attempts to solve them.
Item Type: Article
Uncontrolled Keywords: automotive radars, radar signal processing, object detection, multi-sensor fusion, deep learning, autonomous driving
Divisions: Faculty of Science and Engineering > School of Electrical Engineering, Electronics and Computer Science
Depositing User: Symplectic Admin
Date Deposited: 30 May 2022 13:51
Last Modified: 18 Jan 2023 21:00
DOI: 10.3390/s22114208
URI: https://livrepository.liverpool.ac.uk/id/eprint/3155698