BEV-V2X: Cooperative birds-eye-view fusion and grid occupancy prediction via V2X-based data sharing

C Chang, J Zhang, K Zhang, W Zhong… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Bird's-Eye-View (BEV) perception offers a natural representation of real-world scenes, which is conducive
to multimodal data processing and fusion. BEV data contain rich semantics and integrate the …

MetaScenario: A framework for driving scenario data description, storage and indexing

C Chang, D Cao, L Chen, K Su, K Su… - IEEE Transactions …, 2022 - ieeexplore.ieee.org
Autonomous driving research requires the analysis and use of massive amounts of driving
scenario data. Compared to raw data collected by sensors, scenario data provide …

Deep learning for camera calibration and beyond: A survey

K Liao, L Nie, S Huang, C Lin, J Zhang, Y Zhao… - arXiv preprint arXiv …, 2023 - arxiv.org
Camera calibration involves estimating camera parameters to infer geometric features from
captured sequences, which is crucial for computer vision and robotics. However …
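
As a concrete reference for what "estimating camera parameters" means in practice, below is a minimal sketch of the classical target-based intrinsic calibration baseline that learning-based surveys like this one build upon. It uses OpenCV; the image directory, board size, and square size are placeholder assumptions, not values from the cited work.

```python
# Minimal sketch of chessboard-based intrinsic calibration with OpenCV.
# Assumes images of a 9x6 inner-corner chessboard; all paths/sizes are placeholders.
import glob
import cv2
import numpy as np

BOARD = (9, 6)       # inner corners per row/column (assumed)
SQUARE = 0.025       # square size in metres (assumed)

# 3D corner coordinates on the planar board (z = 0)
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts = [], []
for path in glob.glob("calib_images/*.png"):   # placeholder directory
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_pts.append(objp)
        img_pts.append(corners)

# Jointly estimate the intrinsic matrix K and lens distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Intrinsics K:\n", K)
```

The learning-based methods surveyed in the paper aim to estimate comparable parameters without a dedicated calibration target.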

Survey of extrinsic calibration on lidar-camera system for intelligent vehicle: Challenges, approaches, and trends

P An, J Ding, S Quan, J Yang, Y Yang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
A system combining light detection and ranging (LiDAR) and a camera (referred to as a LiDAR-camera
system) plays an essential role in intelligent vehicles (IVs), as it provides 3D spatial and 2D …

A Review of Deep Learning-Based LiDAR and Camera Extrinsic Calibration

Z Tan, X Zhang, S Teng, L Wang, F Gao - Sensors, 2024 - mdpi.com
Extrinsic parameter calibration is the foundation of and prerequisite for LiDAR-camera data
fusion in autonomous systems. This technology is widely used in fields such as …
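
To make the role of the extrinsics concrete, here is a minimal sketch of how an estimated LiDAR-to-camera transform is used downstream to project LiDAR points into the image for fusion. The function name, intrinsic matrix, and transform values are illustrative assumptions, not taken from the cited review, which focuses on how deep learning estimates that transform.

```python
# Minimal sketch: projecting LiDAR points into an image given (assumed known)
# extrinsics T_cam_lidar and camera intrinsics K.
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """points_lidar: (N, 3) XYZ in the LiDAR frame.
    T_cam_lidar: (4, 4) extrinsic transform LiDAR -> camera.
    K: (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates of points in front of the camera."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])  # homogeneous
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]   # transform into the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]         # keep points in front of the camera
    uv = (K @ pts_cam.T).T                       # perspective projection
    return uv[:, :2] / uv[:, 2:3]                # normalize by depth

# Toy usage with placeholder values (not from any of the cited papers).
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)
T[:3, 3] = [0.1, -0.05, -0.2]   # assumed LiDAR-to-camera offset
pts = np.random.uniform([-5, -2, 2], [5, 2, 30], size=(1000, 3))
print(project_lidar_to_image(pts, T, K)[:3])
```

Errors in the extrinsics shift every projected point, which is why accurate calibration is treated as a prerequisite for fusion throughout these works.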

Robust LiDAR-camera alignment with modality adapted local-to-global representation

A Zhu, Y Xiao, C Liu, Z Cao - … on Circuits and Systems for Video …, 2022 - ieeexplore.ieee.org
LiDAR-Camera alignment (LCA) is an important preprocessing procedure for fusing LiDAR
and camera data. One key issue is extracting a unified cross-modality representation for …

Targetless Lidar-camera Calibration via Cross-modality Structure Consistency

N Ou, H Cai, J Wang - IEEE Transactions on Intelligent …, 2023 - ieeexplore.ieee.org
Lidar and cameras serve as essential sensors for automated vehicles and intelligent robots,
and they are frequently fused in complicated tasks. Precise extrinsic calibration is the …

A Robust LiDAR-Camera Self-Calibration Via Rotation-Based Alignment and Multi-Level Cost Volume

Z Duan, X Hu, J Ding, P An, X Huang… - IEEE Robotics and …, 2023 - ieeexplore.ieee.org
Multi-sensor collaborative perception has been a significant trend in self-driving and robot
navigation. The precondition for multi-sensor fusion is the accurate calibration between …

A Review of Environmental Perception Technology Based on Multi-Sensor Information Fusion in Autonomous Driving

B Yang, J Li, T Zeng - World Electric Vehicle Journal, 2025 - mdpi.com
Environmental perception is a key technology for autonomous driving, enabling vehicles to
analyze and interpret their surroundings in real time to ensure safe navigation and decision …

Enhancing camera calibration for traffic surveillance with an integrated approach of genetic algorithm and particle swarm optimization

S Li, HS Yoon - Sensors, 2024 - mdpi.com
Recent advancements in sensor technologies, coupled with signal processing and machine
learning, have enabled real-time traffic control systems to effectively adapt to changing traffic …