Event-based vision: A survey

G Gallego, T Delbrück, G Orchard… - IEEE transactions on …, 2020 - ieeexplore.ieee.org
Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead
of capturing images at a fixed rate, they asynchronously measure per-pixel brightness …

Computer vision for autonomous vehicles: Problems, datasets and state of the art

J Janai, F Güney, A Behl, A Geiger - Foundations and Trends® …, 2020 - nowpublishers.com
Recent years have witnessed enormous progress in AI-related fields such as computer
vision, machine learning, and autonomous vehicles. As with any rapidly growing field, it …

Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age

C Cadena, L Carlone, H Carrillo, Y Latif… - IEEE Transactions …, 2016 - ieeexplore.ieee.org
Simultaneous localization and mapping (SLAM) consists in the concurrent construction of a
model of the environment (the map), and the estimation of the state of the robot moving …

A review of visual SLAM methods for autonomous driving vehicles

J Cheng, L Zhang, Q Chen, X Hu, J Cai - Engineering Applications of …, 2022 - Elsevier
Autonomous driving vehicles require both a precise localization and mapping solution in
different driving environments. In this context, Simultaneous Localization and Mapping …

Computing systems for autonomous driving: State of the art and challenges

L Liu, S Lu, R Zhong, B Wu, Y Yao… - IEEE Internet of …, 2020 - ieeexplore.ieee.org
The recent proliferation of computing technologies (e.g., sensors, computer vision, machine
learning, and hardware acceleration) and the broad deployment of communication …

A general optimization-based framework for global pose estimation with multiple sensors

T Qin, S Cao, J Pan, S Shen - arXiv preprint arXiv:1901.03642, 2019 - arxiv.org
Accurate state estimation is a fundamental problem for autonomous robots. To achieve
locally accurate and globally drift-free state estimation, multiple sensors with complementary …

Event-based vision meets deep learning on steering prediction for self-driving cars

AI Maqueda, A Loquercio, G Gallego… - Proceedings of the …, 2018 - openaccess.thecvf.com
Event cameras are bio-inspired vision sensors that naturally capture the dynamics of a
scene, filtering out redundant information. This paper presents a deep neural network …

Unsupervised event-based learning of optical flow, depth, and egomotion

AZ Zhu, L Yuan, K Chaney… - Proceedings of the …, 2019 - openaccess.thecvf.com
In this work, we propose a novel framework for unsupervised learning for event cameras that
learns motion information from only the event stream. In particular, we propose an input …

The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM

E Mueggler, H Rebecq, G Gallego… - … Journal of Robotics …, 2017 - journals.sagepub.com
New vision sensors, such as the dynamic and active-pixel vision sensor (DAVIS),
incorporate a conventional global-shutter camera and an event-based sensor in the same …

M2DGR: A multi-sensor and multi-scenario SLAM dataset for ground robots

J Yin, A Li, T Li, W Yu, D Zou - IEEE Robotics and Automation …, 2021 - ieeexplore.ieee.org
We introduce M2DGR: a novel large-scale dataset collected by a ground robot with a full
sensor-suite including six fish-eye and one sky-pointing RGB cameras, an infrared camera …