Hallucinating dense optical flow from sparse lidar for autonomous vehicles
Conference Article
Conference
International Conference on Pattern Recognition (ICPR)
Edition
24th
Pages
1959-1964
Doc link
https://doi.org/10.1109/ICPR.2018.8545803
Abstract
In this paper we propose a novel approach to estimate dense optical flow from sparse lidar data acquired on an autonomous vehicle. It is intended as a drop-in replacement for any image-based optical flow system when images are unreliable, e.g. under adverse weather conditions or at night. In order to infer high-resolution 2D flows from discrete range data, we devise a three-block architecture of multiscale filters that combines multiple intermediate objectives, in both the lidar and image domains. To train this network we introduce a dataset of approximately 20K lidar samples from the KITTI dataset, which we have augmented with a pseudo ground-truth image-based optical flow computed using FlowNet2. We demonstrate the effectiveness of our approach on KITTI, and show that despite using the low-resolution, sparse measurements of the lidar, we can regress dense optical flow maps on par with those estimated by image-based methods.
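A prerequisite for the approach sketched in the abstract is representing the discrete lidar returns on a 2D grid that a convolutional network can process. A minimal illustration of such a spherical projection is shown below; the grid size, field of view, and function name are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def project_lidar_to_image(points, h=64, w=512, fov_up=2.0, fov_down=-24.9):
    """Spherically project N x 3 lidar points into an h x w range image.

    Cells with no return stay at 0, so the result is a sparse 2D map.
    (Hypothetical helper: resolution and vertical FOV are illustrative.)
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)  # azimuth in [-pi, pi]
    pitch = np.arcsin(np.clip(z / np.maximum(r, 1e-8), -1.0, 1.0))

    fov_up_r = np.radians(fov_up)
    fov = fov_up_r - np.radians(fov_down)

    # Map azimuth to columns and elevation to rows of the grid.
    u = ((1.0 - (yaw + np.pi) / (2.0 * np.pi)) * w).astype(int) % w
    v = np.clip((fov_up_r - pitch) / fov * h, 0, h - 1).astype(int)

    img = np.zeros((h, w), dtype=np.float32)
    img[v, u] = r  # keep the last return landing in each cell
    return img

# Tiny synthetic scan: random points around the sensor.
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3)) * [10.0, 10.0, 1.0]
range_img = project_lidar_to_image(pts)
print(range_img.shape)  # most grid cells remain empty (sparse input)
```

The resulting sparse range image is the kind of low-resolution input from which the paper's network regresses a dense flow map.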
Categories
computer vision, feature extraction, pattern recognition
Author keywords
Deep Lidar, Lidar-flow, Autonomous Driving
Scientific reference
V. Vaquero, A. Sanfeliu and F. Moreno-Noguer. Hallucinating dense optical flow from sparse lidar for autonomous vehicles, 24th International Conference on Pattern Recognition, 2018, Beijing, China, pp. 1959-1964.