Results 201 to 210 of about 737,910
Some of the following articles may not be open access.
Extrinsic Calibration of Camera to LIDAR Using a Differentiable Checkerboard Model
IEEE/RSJ International Conference on Intelligent Robots and Systems, 2023
Multi-modal sensing often involves determining correspondences between each domain's signals, which in turn depends on the accurate extrinsic calibration of the sensors.
L. Fu, Nived Chebrolu, Maurice F. Fallon
semanticscholar +1 more source
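The entries in this listing all revolve around applying an estimated extrinsic transform to relate measurements across sensors. As a generic illustration (not the method of any listed paper), the sketch below assumes a hypothetical 4x4 camera-from-LiDAR extrinsic T_cam_lidar and pinhole intrinsics K, and shows how LiDAR points would be projected into the image to establish the cross-domain correspondences mentioned in the abstract above.

```python
# Minimal sketch (not from any listed paper): projecting LiDAR points into a
# camera image using an assumed extrinsic T_cam_lidar and pinhole intrinsics K.
import numpy as np

def project_lidar_to_image(pts_lidar, T_cam_lidar, K):
    """Map Nx3 LiDAR points to pixel coordinates in the camera image.

    pts_lidar   : (N, 3) points in the LiDAR frame
    T_cam_lidar : (4, 4) extrinsic transform, camera frame <- LiDAR frame
    K           : (3, 3) camera intrinsic matrix
    """
    # Homogeneous coordinates, then rigid transform into the camera frame.
    pts_h = np.hstack([pts_lidar, np.ones((pts_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera before projecting.
    in_front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[in_front]

    # Pinhole projection: divide by depth to obtain pixel coordinates.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return uv, in_front

# Example with placeholder values (identity rotation, 10 cm lateral offset).
T_cam_lidar = np.eye(4)
T_cam_lidar[0, 3] = 0.10
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
uv, mask = project_lidar_to_image(np.array([[2.0, 0.5, 10.0]]), T_cam_lidar, K)
```

An error in T_cam_lidar shifts every projected point, which is why the calibration methods surveyed in these results aim for accurate, robust extrinsic estimation before any fusion step.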
IEEE Robotics and Automation Letters, 2023
The strategy of fusing multi-modal data, especially from cameras and light detection and ranging (LiDAR) sensors, is frequently considered in robotics to enhance the performance of perception and navigation tasks.
Feiyi Chen +4 more
semanticscholar +1 more source
Automatic Extrinsic Calibration of Dual LiDARs With Adaptive Surface Normal Estimation
IEEE Transactions on Instrumentation and Measurement, 2023
Solutions equipped with multiple light detection and ranging (LiDAR) systems have been widely used in several fields, including mobile mapping, navigation, robotics, and others. Accurate and robust extrinsic calibration between multiple scanners is necessary …
Mingyan Nie +3 more
semanticscholar +1 more source
HD-Map Aided LiDAR-INS Extrinsic Calibration
2021 IEEE International Intelligent Transportation Systems Conference (ITSC), 2021
Sensor calibration is a prerequisite for autonomous driving and is vital for accurate perception, ensuring precise planning and control of the autonomous vehicle. The modern self-driving car considers data inputs from multiple sensors to construct an understanding of its surroundings.
Henry Wong +5 more
openaire +1 more source
IEEE Sensors Journal, 2023
The fusion of LiDAR and camera data is a promising approach for improving the environmental perception and recognition abilities of robots. The fusion of data from the two sensors plays a vital role in enhancing robotic localization capabilities. Current …
Haitao Liu +4 more
semanticscholar +1 more source
Extrinsic Multi Sensor Calibration under Uncertainties
2019 IEEE Intelligent Transportation Systems Conference (ITSC), 2019
Highly accurate extrinsic sensor calibration is crucial for environment perception of robots, as it allows fusing information from different sensors. On today's robotic platforms, e.g., autonomous cars, a variety of sensors with different measurement characteristics are used.
Tilman Kühner, Julius Kümmerle
openaire +2 more sources
Fast Extrinsic Calibration for 3D LIDAR
Proceedings of the 2019 4th International Conference on Automation, Control and Robotics Engineering, 2019
3D LIDAR-based environmental perception has been widely used in robotics research, especially with the rapid development of vehicle autopilot systems. Coordinate transformation calibration between the LIDAR and the self-driving vehicle's body is a prerequisite for environmental perception.
Ning Li, Tao Luo, Bo Su
openaire +1 more source
Observability-Aware Active Extrinsic Calibration of Multiple Sensors
IEEE International Conference on Robotics and Automation, 2023
The extrinsic parameters play a crucial role in multi-sensor fusion, such as visual-inertial Simultaneous Localization and Mapping (SLAM), as they enable the accurate alignment and integration of measurements from different sensors.
S. Xu +5 more
semanticscholar +1 more source
Calibration for Camera-Motion Capture Extrinsics
2018 International Conference on Image and Vision Computing New Zealand (IVCNZ), 2018
Motion capture is commonly used to track the 3D pose of a camera in order to provide accurate ground truth data for computer vision algorithms such as visual odometry, SLAM, and object tracking. However, it is challenging to manually align the coordinate frame of a camera and a motion capture-tracked object.
Sam D. Schofield +2 more
openaire +1 more source
IEEE Transactions on Intelligent Vehicles
Robust and reliable calibration forms the foundation of efficient multi-sensor fusion. Most existing calibration methods are offline and rely on artificial targets, which is time-consuming and unfriendly to non-expert users.
Youwei Wang +5 more
semanticscholar +1 more source

