FAST-LIO2: Fast Direct LiDAR-Inertial Odometry [PDF]
This article presents FAST-LIO2: a fast, robust, and versatile LiDAR-inertial odometry framework. Building on a highly efficient tightly coupled iterated Kalman filter, FAST-LIO2 has two key novelties that allow fast, robust, and accurate LiDAR ...
Wei Xu +4 more
semanticscholar +1 more source
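
As background for the tightly coupled iterated Kalman filter the FAST-LIO2 entry mentions, below is a minimal Euclidean-state sketch of an iterated measurement update. FAST-LIO2 itself uses an on-manifold error state and point-to-plane residuals, which this sketch omits; the function and parameter names here are illustrative assumptions.

```python
import numpy as np

def iterated_kf_update(x_prior, P, z, h, H_jac, R, max_iters=5, tol=1e-6):
    """Iterated (Gauss-Newton style) Kalman measurement update.

    x_prior : (n,)   prior state mean
    P       : (n, n) prior covariance
    z       : (m,)   measurement
    h       : callable, h(x) -> (m,) predicted measurement
    H_jac   : callable, H_jac(x) -> (m, n) measurement Jacobian
    R       : (m, m) measurement noise covariance
    """
    x = x_prior.copy()
    for _ in range(max_iters):
        H = H_jac(x)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        # Re-linearize around the current iterate, not the prior mean.
        x_next = x_prior + K @ (z - h(x) - H @ (x_prior - x))
        if np.linalg.norm(x_next - x) < tol:
            x = x_next
            break
        x = x_next
    H = H_jac(x)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    P_post = (np.eye(x.size) - K @ H) @ P
    return x, P_post
```

Repeating the linearization at each iterate is what distinguishes the iterated update from a single extended Kalman filter step and is what lets the filter handle the strongly nonlinear LiDAR registration residuals.
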
TransFusion: Robust LiDAR-Camera Fusion for 3D Object Detection with Transformers [PDF]
LiDAR and camera are two important sensors for 3D object detection in autonomous driving. Despite the increasing popularity of sensor fusion in this field, the robustness against inferior image conditions, e.g., bad illumination and sensor misalignment ...
Xuyang Bai +6 more
semanticscholar +1 more source
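
To give a concrete, if simplified, picture of transformer-based fusion, the toy module below uses learned object queries that cross-attend to flattened camera features. This is only a sketch of the general query/cross-attention pattern, not TransFusion's actual two-stage decoder; every shape and name is an illustrative assumption.

```python
import torch
import torch.nn as nn

class QueryFusionSketch(nn.Module):
    """Toy query-based fusion: object queries cross-attend to image features."""

    def __init__(self, dim=256, num_heads=8, num_queries=200):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(num_queries, 1, dim))
        self.cross_attn = nn.MultiheadAttention(dim, num_heads)
        self.norm = nn.LayerNorm(dim)

    def forward(self, image_feats):
        # image_feats: (H*W, B, dim) flattened camera feature map
        batch = image_feats.shape[1]
        q = self.queries.expand(-1, batch, -1)
        attended, _ = self.cross_attn(q, image_feats, image_feats)
        return self.norm(q + attended)  # (num_queries, B, dim)
```
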
Spherical Transformer for LiDAR-Based 3D Recognition [PDF]
LiDAR-based 3D point cloud recognition has benefited various applications. Without specially considering the LiDAR point distribution, most current methods suffer from information disconnection and limited receptive field, especially for the sparse ...
Xin Lai +4 more
semanticscholar +1 more source
Rethinking Range View Representation for LiDAR Segmentation [PDF]
LiDAR segmentation is crucial for autonomous driving perception. Recent trends favor point- or voxel-based methods as they often yield better performance than the traditional range view representation.
Lingdong Kong +8 more
semanticscholar +1 more source
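
Since the entry hinges on what the range view representation is, the sketch below shows the common spherical projection of a point cloud into a range image. The field-of-view bounds and image resolution are placeholder assumptions, not values from the paper.

```python
import numpy as np

def to_range_image(points, h=64, w=1024, fov_up_deg=3.0, fov_down_deg=-25.0):
    """Spherically project an (N, 3) LiDAR point cloud into an (h, w) range image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points[:, :3], axis=1)
    yaw = np.arctan2(y, x)                                   # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-8), -1.0, 1.0))

    fov_up = np.deg2rad(fov_up_deg)
    fov_down = np.deg2rad(fov_down_deg)
    fov = fov_up - fov_down

    u = 0.5 * (1.0 - yaw / np.pi) * w                        # column from azimuth
    v = (1.0 - (pitch - fov_down) / fov) * h                 # row from elevation

    u = np.clip(np.floor(u), 0, w - 1).astype(np.int32)
    v = np.clip(np.floor(v), 0, h - 1).astype(np.int32)

    img = np.full((h, w), -1.0, dtype=np.float32)
    # Fill far-to-near so that closer returns overwrite farther ones per pixel.
    order = np.argsort(-r)
    img[v[order], u[order]] = r[order]
    return img
```
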
BEVFusion: A Simple and Robust LiDAR-Camera Fusion Framework [PDF]
Fusing the camera and LiDAR information has become a de-facto standard for 3D object detection tasks. Current methods rely on point clouds from the LiDAR sensor as queries to leverage features from the image space. However, it has been found that this ...
Tingting Liang +8 more
semanticscholar +1 more source
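
As a rough illustration of fusing camera and LiDAR features in a shared bird's-eye-view grid, one simple alternative to using LiDAR points as queries into image space, a concatenate-and-convolve block might look like the following. This is a generic sketch, not necessarily the exact design of this paper, and the channel sizes are arbitrary assumptions.

```python
import torch
import torch.nn as nn

class BEVConcatFusionSketch(nn.Module):
    """Toy BEV-space fusion: concatenate LiDAR and camera BEV maps, then convolve."""

    def __init__(self, lidar_ch=128, cam_ch=80, out_ch=128):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(lidar_ch + cam_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, lidar_bev, cam_bev):
        # lidar_bev: (B, lidar_ch, H, W); cam_bev: (B, cam_ch, H, W); same BEV grid
        return self.fuse(torch.cat([lidar_bev, cam_bev], dim=1))
```
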
A Survey on Global LiDAR Localization: Challenges, Advances and Open Problems [PDF]
Knowledge of its own pose is key for any mobile robot application; thus, pose estimation is one of the core functionalities of mobile robots. Over the last two decades, LiDAR scanners have become the standard sensor for robot localization and mapping.
Huan Yin +7 more
semanticscholar +1 more source
DeepFusion: Lidar-Camera Deep Fusion for Multi-Modal 3D Object Detection [PDF]
Lidars and cameras are critical sensors that provide complementary information for 3D detection in autonomous driving. While prevalent multi-modal methods [34], [36] simply decorate raw lidar point clouds with camera features and feed them directly to ...
Yingwei Li +12 more
semanticscholar +1 more source
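
To ground the phrase "decorate raw lidar point clouds with camera features," here is a minimal sketch of point decoration under a pinhole camera model. The matrix names (`K`, `T_cam_lidar`) and the nearest-pixel feature sampling are assumptions for illustration, not this paper's implementation.

```python
import numpy as np

def decorate_points(points, image_feats, K, T_cam_lidar):
    """Append camera features to LiDAR points that project inside the image.

    points      : (N, 3) LiDAR points in the LiDAR frame
    image_feats : (H, W, C) per-pixel camera feature map
    K           : (3, 3) camera intrinsics
    T_cam_lidar : (4, 4) LiDAR-to-camera extrinsic transform
    """
    n = points.shape[0]
    pts_h = np.hstack([points, np.ones((n, 1))])            # homogeneous coords
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]              # LiDAR -> camera frame
    z_cam = pts_cam[:, 2]
    in_front = z_cam > 1e-3

    uv = (K @ pts_cam.T).T
    u = uv[:, 0] / np.where(in_front, z_cam, 1.0)
    v = uv[:, 1] / np.where(in_front, z_cam, 1.0)

    h, w, c = image_feats.shape
    valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    feats = np.zeros((n, c), dtype=image_feats.dtype)
    feats[valid] = image_feats[v[valid].astype(int), u[valid].astype(int)]
    return np.hstack([points, feats]), valid
```
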
NeRF-LOAM: Neural Implicit Representation for Large-Scale Incremental LiDAR Odometry and Mapping [PDF]
Simultaneous odometry and mapping using LiDAR data is an important task for mobile systems to achieve full autonomy in large-scale environments.
Junyuan Deng +6 more
semanticscholar +1 more source
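
As a rough illustration of what a neural implicit map representation means here (not NeRF-LOAM's actual hierarchical design), a tiny coordinate-to-SDF network can be sketched as follows; all layer sizes are arbitrary assumptions.

```python
import torch
import torch.nn as nn

class ImplicitSDFSketch(nn.Module):
    """Toy neural implicit map: 3D query point -> signed distance value."""

    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, xyz):
        # xyz: (N, 3) query coordinates; returns (N,) predicted signed distances
        return self.net(xyz).squeeze(-1)
```

Optimizing such a network against distance labels derived from LiDAR scans, with the scan poses kept as free variables in the same loss, is the general idea behind coupling implicit mapping with incremental odometry.
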
General, Single-shot, Target-less, and Automatic LiDAR-Camera Extrinsic Calibration Toolbox [PDF]
This paper presents an open-source LiDAR-camera calibration toolbox that is general to LiDAR and camera projection models, requires only one pairing of LiDAR and camera data without a calibration target, and is fully automatic.
Kenji Koide +3 more
semanticscholar +1 more source
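
To make concrete what an estimated extrinsic is and how it can be sanity-checked, the sketch below assembles a hypothetical LiDAR-to-camera transform from a rotation vector and translation, then measures how many points land inside the image after projection. The names `T_cam_lidar` and `K` and the fraction-inside check are illustrative assumptions, not the toolbox's actual calibration objective.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def make_extrinsic(rotvec, translation):
    """Build a 4x4 LiDAR-to-camera transform from a rotation vector and translation."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(rotvec).as_matrix()
    T[:3, 3] = translation
    return T

def reprojection_check(points_lidar, T_cam_lidar, K, image_shape):
    """Fraction of LiDAR points that project inside the image.

    A crude sanity check of a candidate extrinsic: a badly wrong T_cam_lidar
    typically places most points outside the camera frustum.
    """
    n = points_lidar.shape[0]
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    z = pts_cam[:, 2]
    in_front = z > 1e-3
    uv = (K @ pts_cam.T).T
    u = uv[:, 0] / np.where(in_front, z, 1.0)
    v = uv[:, 1] / np.where(in_front, z, 1.0)
    h, w = image_shape
    inside = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    return float(inside.mean())
```
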
LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping [PDF]
We propose a framework for tightly-coupled lidar-visual-inertial odometry via smoothing and mapping, LVI-SAM, that achieves real-time state estimation and map-building with high accuracy and robustness.
Tixiao Shan +3 more
semanticscholar +1 more source
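
LVI-SAM's backend belongs to the smoothing-and-mapping family, where all factors are optimized jointly over a factor graph. The sketch below builds a toy pose graph with GTSAM's Python bindings to illustrate that smoothing step only; the factor values and noise levels are made up, and the system's actual lidar, visual, IMU-preintegration, and loop-closure factors are not modeled here.

```python
import numpy as np
import gtsam

# Toy pose graph: a prior on the first pose plus odometry-style between-factors,
# smoothed jointly. The constraints are placeholders for illustration.
graph = gtsam.NonlinearFactorGraph()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-3))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-1))

keys = [1, 2, 3]
graph.add(gtsam.PriorFactorPose3(keys[0], gtsam.Pose3(), prior_noise))

step = gtsam.Pose3(gtsam.Rot3.Yaw(0.05), gtsam.Point3(1.0, 0.0, 0.0))
for a, b in zip(keys[:-1], keys[1:]):
    graph.add(gtsam.BetweenFactorPose3(a, b, step, odom_noise))

# Deliberately perturbed initial guesses; the optimizer smooths them out.
initial = gtsam.Values()
for i, k in enumerate(keys):
    noisy = gtsam.Pose3(gtsam.Rot3.Yaw(0.05 * i + 0.02),
                        gtsam.Point3(1.0 * i + 0.1, 0.05, 0.0))
    initial.insert(k, noisy)

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
for k in keys:
    print(result.atPose3(k).translation())
```
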

