Event-based vision: A survey
Event cameras are bio-inspired sensors that differ from conventional frame cameras: Instead
of capturing images at a fixed rate, they asynchronously measure per-pixel brightness …
Computer vision for autonomous vehicles: Problems, datasets and state of the art
Recent years have witnessed enormous progress in AI-related fields such as computer
vision, machine learning, and autonomous vehicles. As with any rapidly growing field, it …
Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age
Simultaneous localization and mapping (SLAM) consists in the concurrent construction of a
model of the environment (the map), and the estimation of the state of the robot moving …
A review of visual SLAM methods for autonomous driving vehicles
Autonomous driving vehicles require both a precise localization and mapping solution in
different driving environments. In this context, Simultaneous Localization and Mapping …
Computing systems for autonomous driving: State of the art and challenges
The recent proliferation of computing technologies (e.g., sensors, computer vision, machine
learning, and hardware acceleration) and the broad deployment of communication …
A general optimization-based framework for global pose estimation with multiple sensors
Accurate state estimation is a fundamental problem for autonomous robots. To achieve
locally accurate and globally drift-free state estimation, multiple sensors with complementary …
Event-based vision meets deep learning on steering prediction for self-driving cars
Event cameras are bio-inspired vision sensors that naturally capture the dynamics of a
scene, filtering out redundant information. This paper presents a deep neural network …
Unsupervised event-based learning of optical flow, depth, and egomotion
In this work, we propose a novel framework for unsupervised learning for event cameras that
learns motion information from only the event stream. In particular, we propose an input …
The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM
New vision sensors, such as the dynamic and active-pixel vision sensor (DAVIS),
incorporate a conventional global-shutter camera and an event-based sensor in the same …
M2DGR: A multi-sensor and multi-scenario SLAM dataset for ground robots
We introduce M2DGR: a novel large-scale dataset collected by a ground robot with a full
sensor-suite including six fish-eye and one sky-pointing RGB cameras, an infrared camera …