Pose estimation of a mobile robot based on fusion of IMU data and vision data using an extended Kalman filter

MB Alatise, GP Hancke - Sensors, 2017 - mdpi.com
Using a single sensor to estimate the pose of a device cannot give accurate
results. This paper presents a fusion of an inertial sensor of six degrees of freedom (6-DoF) …

High-performance embedded computing in space: Evaluation of platforms for vision-based navigation

G Lentaris, K Maragos, I Stratakos… - Journal of Aerospace …, 2018 - arc.aiaa.org
Vision-based navigation has become increasingly important in a variety of space
applications for enhancing autonomy and dependability. Future missions, such as active …

Hybrid indoor localization using IMU sensors and smartphone camera

A Poulose, DS Han - Sensors, 2019 - mdpi.com
Smartphone camera or inertial measurement unit (IMU) sensor-based systems can be
independently used to provide accurate indoor positioning results. However, the accuracy of …

Vision-based navigation and obstacle detection flight results in SLIM lunar landing

T Ishida, S Fukuda, K Kariya, H Kamata, K Takadama… - Acta Astronautica, 2025 - Elsevier
On January 20, 2024, Smart Lander for Investigating Moon (SLIM) landed on the
Moon. Vision-Based Navigation (VBN) was used to estimate the position of the spacecraft …

On-board absolute localization based on orbital imagery for a future Mars Science Helicopter

R Brockers, P Proença, J Delaune… - 2022 IEEE …, 2022 - ieeexplore.ieee.org
Future Mars rotorcraft require advanced navigation capabilities to enable all-terrain access
over long-distance flights that are executed fully autonomously. A critical component to …

Real-time crater-based monocular 3-D pose tracking for planetary landing and navigation

C Liu, W Guo, W Hu, R Chen… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
This article proposes a vision-based framework to track the pose of the lander in real time during
planetary exploration, using craters as landmarks. The contour of the landmark crater is …

Vision-aided inertial navigation for planetary landing without feature extraction and matching

J Hu, Z **g, S Li, M Xin - Acta Astronautica, 2024 - Elsevier
In this paper, a vision-aided inertial navigation method for planetary landing based on the
direct method is proposed. This approach requires only a monocular camera and an inertial …

Visual navigation based on curve matching for planetary landing in unknown environments

P Cui, X Gao, S Zhu, W Shao - Acta Astronautica, 2020 - Elsevier
For planetary landing missions, accurate surface-relative state knowledge is required to safely
land near science targets. However, the lack of an on-board map poses a technical …

Integrated celestial navigation for spacecraft using interferometer and Earth sensor

K Xiong, C Wei - … of the Institution of Mechanical Engineers …, 2020 - journals.sagepub.com
An integrated celestial navigation scheme for spacecraft based on an optical interferometer
and an ultraviolet Earth sensor is presented in this paper. The optical interferometer is …

Illumination Robust Landing Point Visual Localization for Lunar Lander With High-Resolution Map Generation

X Tong, Y Feng, Z Ye, T Li, X Xu, H Xie… - IEEE Journal of …, 2024 - ieeexplore.ieee.org
Landing point localization is of great significance to lunar exploration engineering and
scientific missions. Vision-based landing point localization methods have successfully been …