BEV-V2X: Cooperative birds-eye-view fusion and grid occupancy prediction via V2X-based data sharing
Bird's-Eye-View (BEV) perception can naturally represent real-world scenes, which is conducive
to multimodal data processing and fusion. BEV data contain rich semantics and integrate the …
MetaScenario: A framework for driving scenario data description, storage and indexing
Research related to autonomous driving requires the analysis and use of massive amounts
of driving scenario data. Compared to raw data collected by sensors, scenario data provide …
Deep learning for camera calibration and beyond: A survey
Camera calibration involves estimating camera parameters to infer geometric features from
captured sequences, which is crucial for computer vision and robotics. However …
Survey of extrinsic calibration on lidar-camera system for intelligent vehicle: Challenges, approaches, and trends
A system combining light detection and ranging (LiDAR) with a camera (a LiDAR-camera
system) plays an essential role in intelligent vehicles (IVs), as it provides 3D spatial and 2D …
A Review of Deep Learning-Based LiDAR and Camera Extrinsic Calibration
Z Tan, X Zhang, S Teng, L Wang, F Gao - Sensors, 2024 - mdpi.com
Extrinsic parameter calibration is the foundation and prerequisite for LiDAR-camera
data fusion in autonomous systems. This technology is widely used in fields such as …
Robust LiDAR-camera alignment with modality adapted local-to-global representation
LiDAR-camera alignment (LCA) is an important preprocessing step for fusing LiDAR
and camera data. A key issue is extracting a unified cross-modality representation for …
Targetless Lidar-camera Calibration via Cross-modality Structure Consistency
N Ou, H Cai, J Wang - IEEE Transactions on Intelligent …, 2023 - ieeexplore.ieee.org
Lidar and cameras serve as essential sensors for automated vehicles and intelligent robots,
and they are frequently fused in complicated tasks. Precise extrinsic calibration is the …
A Robust LiDAR-Camera Self-Calibration Via Rotation-Based Alignment and Multi-Level Cost Volume
Multi-sensor collaborative perception has become a significant trend in self-driving and robot
navigation. The precondition for multi-sensor fusion is the accurate calibration between …
A Review of Environmental Perception Technology Based on Multi-Sensor Information Fusion in Autonomous Driving
B Yang, J Li, T Zeng - World Electric Vehicle Journal, 2025 - mdpi.com
Environmental perception is a key technology for autonomous driving, enabling vehicles to
analyze and interpret their surroundings in real time to ensure safe navigation and decision …
Enhancing camera calibration for traffic surveillance with an integrated approach of genetic algorithm and particle swarm optimization
Recent advancements in sensor technologies, coupled with signal processing and machine
learning, have enabled real-time traffic control systems to effectively adapt to changing traffic …