Multi-sensor Fusion and Cooperative Perception for Autonomous Driving: A Review

C Xiang, C Feng, X Xie, B Shi, H Lu… - IEEE Intelligent …, 2023 - ieeexplore.ieee.org
Autonomous driving (AD), including single-vehicle intelligent AD and vehicle–infrastructure
cooperative AD, has become a research hotspot in academia and industry, and …

Automotive LiDAR technology: A survey

R Roriz, J Cabral, T Gomes - IEEE Transactions on Intelligent …, 2021 - ieeexplore.ieee.org
Nowadays, more than a decade after the first steps towards autonomous driving, we are
still pushing to achieve fully autonomous vehicles on our roads, with LiDAR sensors being …

An outline of multi-sensor fusion methods for mobile agents' indoor navigation

Y Qu, M Yang, J Zhang, W Xie, B Qiang, J Chen - Sensors, 2021 - mdpi.com
Indoor autonomous navigation refers to the perception and exploration abilities of mobile
agents in unknown indoor environments with the help of various sensors. It is the basic and …

Automatic extrinsic calibration method for LiDAR and camera sensor setups

J Beltrán, C Guindel, A De La Escalera… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Most sensor setups for onboard autonomous perception are composed of LiDARs and
vision systems, as they provide complementary information that improves the reliability of the …
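
Several of the entries in this list concern LiDAR–camera extrinsic calibration, i.e., estimating the rigid transform (rotation R, translation t) between the two sensors so their data can be fused. The sketch below is an illustrative assumption, not taken from any of the cited papers: it shows only how a calibrated extrinsic pair (R, t), together with a pinhole intrinsic matrix K, is used to project LiDAR points into the camera image. The function name and all numeric values are placeholders.

import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project Nx3 LiDAR points (metres) into pixel coordinates.

    R (3x3) and t (3,) map LiDAR coordinates into the camera frame;
    K (3x3) is the pinhole intrinsic matrix. Points behind the camera
    are discarded.
    """
    # Transform points from the LiDAR frame to the camera frame.
    points_cam = points_lidar @ R.T + t
    # Keep only points with positive depth (in front of the camera).
    points_cam = points_cam[points_cam[:, 2] > 0]
    # Perspective projection with the intrinsic matrix, then divide by depth.
    pixels_h = points_cam @ K.T
    pixels = pixels_h[:, :2] / pixels_h[:, 2:3]
    return pixels, points_cam[:, 2]

# Placeholder calibration values (assumed, for illustration only):
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
# Axis change from LiDAR (x forward, y left, z up) to camera (x right, y down, z forward).
R = np.array([[0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0],
              [1.0, 0.0, 0.0]])
t = np.array([0.02, -0.10, 0.05])  # assumed LiDAR-to-camera offset in metres
pts = np.array([[5.0, 0.5, 0.0], [10.0, -1.0, 0.3]])
uv, depth = project_lidar_to_image(pts, R, t, K)
print(np.round(uv, 1), depth)

The calibration methods surveyed in this list differ in how R and t are estimated (calibration targets, targetless geometric alignment, learned features, or radiance fields), but all of them ultimately feed a projection step of this form.
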

Targetless extrinsic calibration of multiple small FoV LiDARs and cameras using adaptive voxelization

X Liu, C Yuan, F Zhang - IEEE Transactions on Instrumentation …, 2022 - ieeexplore.ieee.org
Determining the extrinsic parameters between multiple light detection and ranging (LiDAR)
sensors and cameras is essential for autonomous robots, especially for solid-state LiDARs, where …

Automatic targetless LiDAR–camera calibration: a survey

X Li, Y Xiao, B Wang, H Ren, Y Zhang, J Ji - Artificial Intelligence Review, 2023 - Springer
The recent trend of fusing complementary data from LiDARs and cameras for more accurate
perception has made the extrinsic calibration between the two sensors critically important …

RGGNet: Tolerance aware LiDAR-camera online calibration with geometric deep learning and generative model

K Yuan, Z Guo, ZJ Wang - IEEE Robotics and Automation …, 2020 - ieeexplore.ieee.org
Accurate LiDAR-camera online calibration is critical for modern autonomous vehicles and
robot platforms. Dominant methods heavily rely on hand-crafted features, which are not …

SOAC: Spatio-temporal overlap-aware multi-sensor calibration using neural radiance fields

Q Herau, N Piasco, M Bennehar… - Proceedings of the …, 2024 - openaccess.thecvf.com
In rapidly evolving domains such as autonomous driving, the use of multiple sensors with
different modalities is crucial to ensure high operational precision and stability. To correctly …

Elasticity meets continuous-time: Map-centric dense 3D LiDAR SLAM

C Park, P Moghadam, JL Williams… - IEEE Transactions …, 2021 - ieeexplore.ieee.org
Map-centric SLAM utilizes elasticity as a means of loop closure. This approach reduces the
cost of loop closure while still providing large-scale fusion-based dense maps, when …

Survey of extrinsic calibration on LiDAR–camera system for intelligent vehicle: Challenges, approaches, and trends

P An, J Ding, S Quan, J Yang, Y Yang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
A system with light detection and ranging (LiDAR) and a camera (referred to as a LiDAR–camera
system) plays an essential role in intelligent vehicles (IVs), for it provides 3D spatial and 2D …