Learning-based view synthesis for light field cameras [PDF]
With the introduction of consumer light field cameras, light field imaging has recently become widespread. However, there is an inherent trade-off between the angular and spatial resolution, and thus, these cameras often sparsely sample in either spatial or angular domain.
N. Kalantari +2 more
semanticscholar +1 more source
Event-Based Motion Segmentation by Motion Compensation [PDF]
In contrast to traditional cameras, whose pixels have a common exposure time, event-based cameras are novel bio-inspired sensors whose pixels work independently and asynchronously output intensity changes (called "events"), with microsecond resolution ...
Drummond, Tom +4 more
core +1 more source
Dynamic obstacle avoidance for quadrotors with event cameras
Micro-aerial vehicles dodge fast-moving objects using only onboard sensing and computation. Today’s autonomous drones have reaction times of tens of milliseconds, which is not enough for navigating fast in complex dynamic environments.
Davide Falanga +2 more
semanticscholar +1 more source
Video to Events: Recycling Video Datasets for Event Cameras [PDF]
Event cameras are novel sensors that output brightness changes in the form of a stream of asynchronous "events" instead of intensity frames. They offer significant advantages with respect to conventional cameras: high dynamic range (HDR), high temporal ...
Daniel Gehrig +3 more
semanticscholar +1 more source
A Unifying Contrast Maximization Framework for Event Cameras, with Applications to Motion, Depth, and Optical Flow Estimation [PDF]
We present a unifying framework to solve several computer vision problems with event cameras: motion, depth and optical flow estimation. The main idea of our framework is to find the point trajectories on the image plane that are best aligned with the ...
Guillermo Gallego +2 more
semanticscholar +1 more source
A unilateral 3D indoor positioning system employing optical camera communications
This article investigates the use of a visible light positioning system in an indoor environment to provide a three-dimensional (3D) high-accuracy solution. The proposed system leverages the use of a single light-emitting diode and an image sensor at the receiver.
Othman Isam Younus +6 more
doaj +1 more source
Automatic calibration of multiple cameras and lidars for autonomous vehicles
Autonomous navigation of unmanned vehicles (UVs) is currently one of the most interesting scientific and technical problems, and this is even more true for UVs moving across rough terrain.
Y.B. Blokhinov +3 more
doaj +1 more source
BLAST Autonomous Daytime Star Cameras [PDF]
We have developed two redundant daytime star cameras to provide the fine pointing solution for the balloon-borne submillimeter telescope, BLAST. The cameras are capable of providing a reconstructed pointing solution with an absolute accuracy < 5 ...
Chapin, Edward +6 more
core +2 more sources
Robust Intrinsic and Extrinsic Calibration of RGB-D Cameras [PDF]
Color-depth cameras (RGB-D cameras) have become the primary sensors in most robotics systems, from service robotics to industrial robotics applications.
Basso, Filippo +2 more
core +2 more sources
Camera Calibration Through Camera Projection Loss
Camera calibration is a necessity in various tasks including 3D reconstruction, hand-eye coordination for robotic interaction, autonomous driving, etc. In this work we propose a novel method to predict extrinsic (baseline, pitch, and translation) and intrinsic (focal length and principal point offset) parameters using an image pair.
Butt, Talha Hanif, Taj, Murtaza
openaire +2 more sources

