A review of multi-sensor fusion SLAM systems based on 3D LIDAR

X Xu, L Zhang, J Yang, C Cao, W Wang, Y Ran, Z Tan… - Remote Sensing, 2022 - mdpi.com
The demand for intelligent unmanned platforms capable of autonomous navigation and
positioning in large-scale environments has grown rapidly, in which …

Recent advances in bipedal walking robots: Review of gait, drive, sensors and control systems

T Mikolajczyk, E Mikołajewska, HFN Al-Shuka… - Sensors, 2022 - mdpi.com
Bipedal walking robots are currently under intensive development. The best-known
solutions are based on the principles of human gait, created in nature during …

Super Odometry: IMU-centric LiDAR-visual-inertial estimator for challenging environments

S Zhao, H Zhang, P Wang, L Nogueira… - 2021 IEEE/RSJ …, 2021 - ieeexplore.ieee.org
We propose Super Odometry, a high-precision multi-modal sensor fusion framework,
providing a simple but effective way to fuse multiple sensors such as LiDAR, camera, and …
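
As a rough illustration of the IMU-centric idea (the IMU propagates the state at high rate while other sensors supply intermittent corrections), the sketch below dead-reckons position from accelerometer data and blends in occasional position fixes with a constant gain. The function names, the constant-gain fusion, and the toy data are assumptions for illustration, not Super Odometry's tightly coupled estimator.

```python
# Illustrative only: a loosely coupled, IMU-centric fusion loop (not the paper's code).
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])

def imu_predict(p, v, acc_world, dt):
    """Dead-reckon position and velocity from a world-frame specific-force sample."""
    a = acc_world + GRAVITY                      # remove gravity from the measurement
    p_new = p + v * dt + 0.5 * a * dt**2
    v_new = v + a * dt
    return p_new, v_new

def fuse_position(p_pred, p_meas, gain=0.3):
    """Blend an external (lidar/visual) position fix into the IMU prediction."""
    return p_pred + gain * (p_meas - p_pred)

rng = np.random.default_rng(0)
p, v, dt = np.zeros(3), np.zeros(3), 0.01
for k in range(100):
    # Specific force: thrust cancelling gravity plus a small push along x.
    p, v = imu_predict(p, v, np.array([0.1, 0.0, 9.81]), dt)
    if k % 10 == 9:                              # intermittent lidar/visual correction
        p = fuse_position(p, p + rng.normal(0.0, 0.01, 3))
print("final position:", p)
```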

Unified multi-modal landmark tracking for tightly coupled lidar-visual-inertial odometry

D Wisth, M Camurri, S Das… - IEEE Robotics and …, 2021 - ieeexplore.ieee.org
We present an efficient multi-sensor odometry system for mobile platforms that jointly
optimizes visual, lidar, and inertial information within a single integrated factor graph. This …
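
A minimal sketch of the single-factor-graph idea, assuming the GTSAM library: two relative-pose factors, standing in for a lidar registration and a visual odometry estimate, constrain the same pair of poses and are optimized jointly. The noise values and measurements are made up for illustration; this is not the authors' implementation.

```python
# Illustrative fusion of two odometry sources in one factor graph (assumes GTSAM).
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X

graph = gtsam.NonlinearFactorGraph()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
lidar_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-2))
visual_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 5e-2))

# Anchor the first pose, then add two relative-pose measurements between X(0) and X(1):
# one from a lidar registration, one from visual odometry. Both constrain the same pair
# of poses, which is the "single integrated factor graph" idea in miniature.
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))
lidar_delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.00, 0.00, 0.0))
visual_delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.98, 0.02, 0.0))
graph.add(gtsam.BetweenFactorPose3(X(0), X(1), lidar_delta, lidar_noise))
graph.add(gtsam.BetweenFactorPose3(X(0), X(1), visual_delta, visual_noise))

initial = gtsam.Values()
initial.insert(X(0), gtsam.Pose3())
initial.insert(X(1), gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(0.9, 0.0, 0.0)))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(X(1)))                      # estimate pulled toward both measurements
```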

VILENS: Visual, inertial, lidar, and leg odometry for all-terrain legged robots

D Wisth, M Camurri, M Fallon - IEEE Transactions on Robotics, 2022 - ieeexplore.ieee.org
We present the Visual Inertial Lidar Legged Navigation System (VILENS), an odometry system for
legged robots based on factor graphs. The key novelty is the tight fusion of four different …
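
The leg-odometry ingredient can be illustrated with the standard stance-foot assumption: if a foot in contact is stationary, the base velocity follows from the leg Jacobian, the joint rates, and the base angular rate. The Jacobian and the numbers below are placeholders, not VILENS's kinematic model.

```python
# Illustrative leg-odometry measurement (not VILENS itself): with a stance foot assumed
# fixed on the ground, 0 = v_base + omega x p_foot + J(q) @ qdot, hence
#   v_base = -(J(q) @ qdot) - cross(omega, p_foot)
import numpy as np

def leg_odometry_velocity(J, qdot, omega, p_foot):
    """Base linear velocity (base frame) implied by a stationary stance foot."""
    return -(J @ qdot) - np.cross(omega, p_foot)

J = np.array([[0.00, -0.35, -0.18],
              [0.30,  0.00,  0.00],
              [0.00, -0.05, -0.20]])      # placeholder foot Jacobian at the current q
qdot = np.array([0.1, -0.4, 0.2])         # joint velocities [rad/s]
omega = np.array([0.0, 0.0, 0.05])        # base angular velocity [rad/s]
p_foot = np.array([0.25, 0.15, -0.45])    # foot position in the base frame [m]

print(leg_odometry_velocity(J, qdot, omega, p_foot))
```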

SLAM overview: from single sensor to heterogeneous fusion

W Chen, C Zhou, G Shang, X Wang, Z Li, C Xu, K Hu - Remote Sensing, 2022 - mdpi.com
After decades of development, LIDAR and visual SLAM technology has matured considerably
and is widely used in military and civilian fields. SLAM technology enables the mobile …

A comprehensive overview of core modules in visual SLAM framework

D Cai, R Li, Z Hu, J Lu, S Li, Y Zhao - Neurocomputing, 2024 - Elsevier
Visual Simultaneous Localization and Mapping (VSLAM) has become a
key technology in autonomous driving and robot navigation. Relying on camera sensors …

Recent progress in legged robots locomotion control

J Carpentier, PB Wieber - Current Robotics Reports, 2021 - Springer
Purpose of review: In recent years, legged robot locomotion has been transitioning from
mostly flat ground in controlled settings to generic indoor and outdoor environments …

Overview of multi-robot collaborative SLAM from the perspective of data fusion

W Chen, X Wang, S Gao, G Shang, C Zhou, Z Li, C Xu… - Machines, 2023 - mdpi.com
To meet the demands of large-scale environmental mapping, groups of lightweight and
inexpensive robots can be used to perceive the environment; the multi-robot …

Learning inertial odometry for dynamic legged robot state estimation

R Buchanan, M Camurri, F Dellaert… - Conference on robot …, 2022 - proceedings.mlr.press
This paper introduces a novel proprioceptive state estimator for legged robots based on a
learned displacement measurement from IMU data. Recent research in pedestrian tracking …
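
A minimal sketch of the learned-displacement idea, assuming PyTorch and an arbitrary MLP: a window of IMU samples is mapped to a 3D displacement that could then serve as a relative measurement in a state estimator. The network architecture, window length, and tensor shapes are assumptions, not the paper's model.

```python
# Illustrative only: map an IMU window to a predicted displacement (not the paper's network).
import torch
import torch.nn as nn

class ImuDisplacementNet(nn.Module):
    """Maps a window of IMU samples (accel + gyro) to a predicted 3D displacement."""
    def __init__(self, window=50):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                     # (B, window, 6) -> (B, window*6)
            nn.Linear(window * 6, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 3),                 # predicted displacement over the window
        )

    def forward(self, imu_window):
        return self.net(imu_window)

model = ImuDisplacementNet()
imu_window = torch.randn(1, 50, 6)            # one batch of 50 accel/gyro samples
delta_p = model(imu_window)                   # relative translation, usable as a factor
print(delta_p.shape)                          # torch.Size([1, 3])
```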