A comprehensive overview of core modules in visual SLAM framework

D Cai, R Li, Z Hu, J Lu, S Li, Y Zhao - Neurocomputing, 2024 - Elsevier
Visual Simultaneous Localization and Mapping (VSLAM) technology has become a
key technology in autonomous driving and robot navigation. Relying on camera sensors …

State of the art in vision-based localization techniques for autonomous navigation systems

Y Alkendi, L Seneviratne, Y Zweiri - IEEE Access, 2021 - ieeexplore.ieee.org
Vision-based localization systems, namely visual odometry (VO) and visual inertial odometry
(VIO), have attracted great attention recently. They are regarded as critical modules for …

[PDF][PDF] LOAM: Lidar odometry and mapping in real-time.

J Zhang, S Singh - Robotics: Science and systems, 2014 - ri.cmu.edu
We propose a real-time method for odometry and mapping using range measurements from
a 2-axis lidar moving in 6-DOF. The problem is hard because the range measurements are …

A survey on odometry for autonomous navigation systems

SAS Mohamed, MH Haghbayan, T Westerlund… - IEEE …, 2019 - ieeexplore.ieee.org
The development of a navigation system is one of the major challenges in building a fully
autonomous platform. Full autonomy requires a dependable navigation capability not only in …

SLAM; definition and evolution

H Taheri, ZC Xia - Engineering Applications of Artificial Intelligence, 2021 - Elsevier
Simultaneous Localization and Mapping (SLAM) is a key problem in the field of
Artificial Intelligence and mobile robotics that addresses the problem of localization and …

Keyframe-based visual–inertial odometry using nonlinear optimization

S Leutenegger, S Lynen, M Bosse… - … Journal of Robotics …, 2015 - journals.sagepub.com
Combining visual and inertial measurements has become popular in mobile robotics, since
the two sensing modalities offer complementary characteristics that make them the ideal …

A benchmark for the evaluation of RGB-D SLAM systems

J Sturm, N Engelhard, F Endres… - 2012 IEEE/RSJ …, 2012 - ieeexplore.ieee.org
In this paper, we present a novel benchmark for the evaluation of RGB-D SLAM systems. We
recorded a large set of image sequences from a Microsoft Kinect with highly accurate and …

Laser–visual–inertial odometry and mapping with high robustness and low drift

J Zhang, S Singh - Journal of field robotics, 2018 - Wiley Online Library
We present a data processing pipeline to online estimate ego‐motion and build a map of the
traversed environment, leveraging data from a 3D laser scanner, a camera, and an inertial …

High-precision, consistent EKF-based visual-inertial odometry

M Li, AI Mourikis - The International Journal of Robotics …, 2013 - journals.sagepub.com
In this paper, we focus on the problem of motion tracking in unknown environments using
visual and inertial sensors. We term this estimation task visual–inertial odometry (VIO), in …

Visual odometry [tutorial]

D Scaramuzza, F Fraundorfer - IEEE robotics & automation …, 2011 - ieeexplore.ieee.org
Visual odometry (VO) is the process of estimating the egomotion of an agent (e.g., vehicle,
human, and robot) using only the input of a single or multiple cameras attached to it …