Radar-camera fusion for object detection and semantic segmentation in autonomous driving: A comprehensive review
Driven by deep learning techniques, perception technology in autonomous driving has
developed rapidly in recent years, enabling vehicles to accurately detect and interpret …
Radars for autonomous driving: A review of deep learning methods and challenges
Radar is a key component of the suite of perception sensors used for safe and reliable
navigation of autonomous vehicles. Its unique capabilities include high-resolution velocity …
Multimodal semantic segmentation in autonomous driving: A review of current approaches and future perspectives
The perception of the surrounding environment is a key requirement for autonomous driving
systems, yet the computation of an accurate semantic representation of the scene starting …
Echoes beyond points: Unleashing the power of raw radar data in multi-modality fusion
Radar is ubiquitous in autonomous driving systems due to its low cost and good adaptability
to bad weather. Nevertheless, the radar detection performance is usually inferior because its …
Deep transfer learning for intelligent vehicle perception: A survey
Deep learning-based intelligent vehicle perception has been developing prominently in
recent years to provide a reliable source for motion planning and decision making in …
RadarGNN: Transformation invariant graph neural network for radar-based perception
F Fent, P Bauerschmidt… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
A reliable perception has to be robust against challenging environmental conditions.
Therefore, recent efforts focused on the use of radar sensors in addition to camera and lidar …
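The entry above concerns transformation-invariant graph neural networks on radar point clouds. As a rough illustration of that general idea only (not the paper's actual RadarGNN model), the following minimal PyTorch sketch builds a k-nearest-neighbor graph over radar detections and uses relative offsets between neighboring points as edge features, which makes the layer invariant to global translations of the point cloud; all dimensions, names, and feature choices are illustrative assumptions.

```python
# Minimal sketch (not RadarGNN itself): an EdgeConv-style message-passing layer
# over a k-NN graph of radar points. Relative neighbor offsets as edge features
# give invariance to global translations. All sizes are illustrative assumptions.
import torch
import torch.nn as nn


class TranslationInvariantRadarLayer(nn.Module):
    def __init__(self, in_dim: int = 4, out_dim: int = 32, k: int = 8):
        # in_dim: per-point radar features, e.g. (x, y, radial velocity, RCS).
        super().__init__()
        self.k = k
        # The edge MLP sees [neighbor features, relative position] per edge.
        self.edge_mlp = nn.Sequential(
            nn.Linear(in_dim + 2, out_dim),
            nn.ReLU(inplace=True),
            nn.Linear(out_dim, out_dim),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (N, in_dim); the first two columns are x, y positions.
        xy = points[:, :2]
        dists = torch.cdist(xy, xy)                              # pairwise distances
        knn_idx = dists.topk(self.k, largest=False).indices      # (N, k) nearest neighbors
        neighbors = points[knn_idx]                              # (N, k, in_dim)
        rel_pos = neighbors[:, :, :2] - xy.unsqueeze(1)          # translation-invariant offsets
        edge_in = torch.cat([neighbors, rel_pos], dim=-1)        # (N, k, in_dim + 2)
        edge_out = self.edge_mlp(edge_in)                        # (N, k, out_dim)
        return edge_out.max(dim=1).values                        # max-aggregate per node


if __name__ == "__main__":
    radar_points = torch.randn(100, 4)  # hypothetical radar detections
    feats = TranslationInvariantRadarLayer()(radar_points)
    print(feats.shape)  # torch.Size([100, 32])
```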
Radarverses in metaverses: A CPSI-based architecture for 6S radar systems in CPSS
Metaverses have caused significant changes in the industry and their academic foundation
can be traced back to the term cyber–physical–social systems (CPSS), which was proposed …
Vehicle-to-everything (V2X) in the autonomous vehicles domain – A technical review of communication, sensor, and AI technologies for road user safety
Autonomous vehicles (AV) are rapidly becoming integrated into everyday life, with several
countries anticipating their inclusion in public transport networks in the coming years. Safety …
RCBEVDet: Radar-camera Fusion in Bird's Eye View for 3D Object Detection
Three-dimensional object detection is one of the key tasks in autonomous driving. To reduce
costs in practice, low-cost multi-view cameras for 3D object detection are proposed to …
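The entry above names radar-camera fusion on a shared bird's-eye-view (BEV) grid. As a rough illustration of that general idea only (not RCBEVDet's actual architecture), the following minimal PyTorch sketch fuses camera and radar feature maps that are assumed to already be projected onto the same BEV grid, via channel-wise concatenation and a small convolutional block; all channel counts and names are illustrative assumptions.

```python
# Minimal sketch (not RCBEVDet's architecture): fusing camera-BEV and radar-BEV
# feature maps that share the same grid, by concatenation plus a conv block.
# Tensor shapes and channel counts are illustrative assumptions.
import torch
import torch.nn as nn


class SimpleBEVFusion(nn.Module):
    """Concatenate camera-BEV and radar-BEV features and mix them with convolutions."""

    def __init__(self, cam_channels: int = 64, radar_channels: int = 16, out_channels: int = 64):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(cam_channels + radar_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, cam_bev: torch.Tensor, radar_bev: torch.Tensor) -> torch.Tensor:
        # Both inputs are assumed aligned on the same BEV grid: (B, C, H, W).
        fused = torch.cat([cam_bev, radar_bev], dim=1)
        return self.mix(fused)


if __name__ == "__main__":
    cam = torch.randn(1, 64, 128, 128)    # hypothetical camera BEV features
    radar = torch.randn(1, 16, 128, 128)  # hypothetical radar BEV features
    out = SimpleBEVFusion()(cam, radar)
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```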
Dual radar: A multi-modal dataset with dual 4D radar for autonomous driving
Radar has stronger adaptability to adverse scenarios in autonomous driving environmental
perception than the widely adopted cameras and LiDARs. Compared with commonly …