Towards deep radar perception for autonomous driving: Datasets, methods, and challenges

Y Zhou, L Liu, H Zhao, M López-Benítez, L Yu, Y Yue - Sensors, 2022 - mdpi.com
With recent developments, the performance of automotive radar has improved significantly.
The next generation of 4D radar can achieve imaging capability in the form of high …

Sensing and machine learning for automotive perception: A review

A Pandharipande, CH Cheng, J Dauwels… - IEEE Sensors …, 2023 - ieeexplore.ieee.org
Automotive perception involves understanding the external driving environment and the
internal state of the vehicle cabin and occupants using sensor data. It is critical to achieving …

Simple-BEV: What really matters for multi-sensor BEV perception?

AW Harley, Z Fang, J Li, R Ambrus… - … on Robotics and …, 2023 - ieeexplore.ieee.org
Building 3D perception systems for autonomous vehicles that do not rely on high-density
LiDAR is a critical research problem because of the expense of LiDAR systems compared to …

SMURF: Spatial multi-representation fusion for 3D object detection with 4D imaging radar

J Liu, Q Zhao, W Xiong, T Huang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
The 4D millimeter-wave (mmWave) radar is a promising technology for vehicle sensing due
to its cost-effectiveness and operability in adverse weather conditions. However, the …

Radar-camera fusion for object detection and semantic segmentation in autonomous driving: A comprehensive review

S Yao, R Guan, X Huang, Z Li, X Sha… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Driven by deep learning techniques, perception technology in autonomous driving has
developed rapidly in recent years, enabling vehicles to accurately detect and interpret …

ProxyFormer: Proxy alignment assisted point cloud completion with missing part sensitive transformer

S Li, P Gao, X Tan, M Wei - … of the IEEE/CVF conference on …, 2023 - openaccess.thecvf.com
Problems such as equipment defects or limited viewpoints will lead the captured point
clouds to be incomplete. Therefore, recovering the complete point clouds from the partial …

Sparse instance conditioned multimodal trajectory prediction

Y Dong, L Wang, S Zhou, G Hua - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Pedestrian trajectory prediction is critical in many vision tasks but challenging due to the
multimodality of the future trajectory. Most existing methods predict multimodal trajectories …

RaLiBEV: Radar and LiDAR BEV fusion learning for anchor box free object detection systems

Y Yang, J Liu, T Huang, QL Han… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
In autonomous driving, LiDAR and radar are crucial for environmental perception. LiDAR
offers precise 3D spatial sensing information but struggles in adverse weather like fog …

Azimuth super-resolution for FMCW radar in autonomous driving

YJ Li, S Hunt, J Park, M O'Toole… - Proceedings of the …, 2023 - openaccess.thecvf.com
We tackle the task of Azimuth (angular dimension) super-resolution for Frequency
Modulated Continuous Wave (FMCW) multiple-input multiple-output (MIMO) radar. FMCW …

Which framework is suitable for online 3D multi-object tracking for autonomous driving with automotive 4D imaging radar?

J Liu, G Ding, Y Xia, J Sun, T Huang… - 2024 IEEE Intelligent …, 2024 - ieeexplore.ieee.org
Online 3D multi-object tracking (MOT) has recently received significant research interests
due to the expanding demand of 3D perception in advanced driver assistance systems …