A review of multi-sensor fusion SLAM systems based on 3D LIDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
Demand for intelligent unmanned platforms capable of autonomous navigation and
positioning in large-scale environments is growing, in which …

A review of visual SLAM methods for autonomous driving vehicles

J Cheng, L Zhang, Q Chen, X Hu, J Cai - Engineering Applications of …, 2022 - Elsevier
Autonomous driving vehicles require both a precise localization and mapping solution in
different driving environments. In this context, Simultaneous Localization and Mapping …

FAST-LIO2: Fast direct lidar-inertial odometry

W Xu, Y Cai, D He, J Lin, F Zhang - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
This article presents FAST-LIO2: a fast, robust, and versatile LiDAR-inertial odometry
framework. Building on a highly efficient tightly coupled iterated Kalman filter, FAST-LIO2 …

LVI-SAM: Tightly-coupled lidar-visual-inertial odometry via smoothing and mapping

T Shan, B Englot, C Ratti, D Rus - 2021 IEEE international …, 2021 - ieeexplore.ieee.org
We propose a framework for tightly-coupled lidar-visual-inertial odometry via smoothing and
mapping, LVI-SAM, that achieves real-time state estimation and map-building with high …

LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping

T Shan, B Englot, D Meyers, W Wang… - 2020 IEEE/RSJ …, 2020 - ieeexplore.ieee.org
We propose a framework for tightly-coupled lidar inertial odometry via smoothing and
mapping, LIO-SAM, that achieves highly accurate, real-time mobile robot trajectory …

FAST-LIO: A fast, robust lidar-inertial odometry package by tightly-coupled iterated Kalman filter

W Xu, F Zhang - IEEE Robotics and Automation Letters, 2021 - ieeexplore.ieee.org
This letter presents a computationally efficient and robust LiDAR-inertial odometry
framework. We fuse LiDAR feature points with IMU data using a tightly-coupled iterated …

DSEC: A stereo event camera dataset for driving scenarios

M Gehrig, W Aarents, D Gehrig… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org
Once an academic venture, autonomous driving has received unparalleled corporate
funding in the last decade. Still, operating conditions of current autonomous cars are mostly …

Milestones in autonomous driving and intelligent vehicles—Part II: Perception and planning

L Chen, S Teng, B Li, X Na, Y Li, Z Li… - … on Systems, Man …, 2023 - ieeexplore.ieee.org
A growing interest in autonomous driving (AD) and intelligent vehicles (IVs) is fueled by their
promise for enhanced safety, efficiency, and economic benefits. While previous surveys …

R²LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping

J Lin, C Zheng, W Xu, F Zhang - IEEE Robotics and Automation …, 2021 - ieeexplore.ieee.org
In this letter, we propose a robust, real-time tightly-coupled multi-sensor fusion framework,
which fuses measurements from LiDAR, inertial sensor, and visual camera to achieve robust …

Direct lidar odometry: Fast localization with dense point clouds

K Chen, BT Lopez… - IEEE Robotics and …, 2022 - ieeexplore.ieee.org
Field robotics in perceptually-challenging environments requires fast and accurate state
estimation, but modern LiDAR sensors quickly overwhelm current odometry algorithms. To …