Accurate and linear time pose estimation from points and lines

Conference Article

Conference

European Conference on Computer Vision (ECCV)

Edition

14th

Pages

583-599

Doc link

http://dx.doi.org/10.1007/978-3-319-46478-7_36

Abstract

The Perspective-n-Point (PnP) problem seeks to estimate the pose of a calibrated camera from n 3D-to-2D point correspondences. There are situations, though, where PnP solutions are prone to fail because feature point correspondences cannot be reliably estimated (e.g. scenes with repetitive patterns or low texture). In such scenarios, one can still exploit alternative geometric entities, such as lines, yielding the so-called Perspective-n-Line (PnL) algorithms. Unfortunately, existing PnL solutions are not as accurate and efficient as their point-based counterparts. In this paper we propose a novel approach to introduce 3D-to-2D line correspondences into a PnP formulation, allowing points and lines to be processed simultaneously. For this purpose we introduce an algebraic line error that can be formulated as linear constraints on the line endpoints, even when these are not directly observable. These constraints can then be naturally integrated within the linear formulations of two state-of-the-art point-based algorithms, OPnP and EPnP, allowing them to handle points, lines, or any combination of the two. Exhaustive experiments show that the proposed formulation brings a remarkable boost in performance compared to point-only or line-only solutions, with negligible computational overhead relative to the original OPnP and EPnP.
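As a rough illustration of the algebraic line error mentioned in the abstract, the sketch below computes the signed algebraic distance between a projected 3D line endpoint and a detected 2D image line. For a fixed pose this error is linear in the 3D endpoint, which is what allows such constraints to be stacked into a linear PnP formulation. All numeric values are toy assumptions for illustration, and this is not the paper's exact formulation (which also covers endpoints that are not directly observable).

```python
import numpy as np

def project(K, R, t, P):
    """Project a 3D point P to homogeneous image coordinates K (R P + t)."""
    return K @ (R @ P + t)

def line_endpoint_error(l, K, R, t, P):
    """Signed algebraic distance of the projected endpoint to image line l.

    l = (a, b, c) are homogeneous line coefficients, normalized so that
    a^2 + b^2 = 1. For a fixed pose (R, t) this error is linear in P.
    """
    p = project(K, R, t, P)
    p = p / p[2]          # normalize to (u, v, 1)
    return float(l @ p)

# Toy camera and 3D line (all values assumed for illustration).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
P1 = np.array([-1.0, 0.0, 0.0])   # 3D line endpoints
P2 = np.array([1.0, 0.0, 0.0])

# Image line through the two projected endpoints: the cross product of
# homogeneous image points, normalized on its first two coefficients.
p1 = project(K, R, t, P1); p1 /= p1[2]
p2 = project(K, R, t, P2); p2 /= p2[2]
l = np.cross(p1, p2)
l = l / np.linalg.norm(l[:2])

# Under the true pose, endpoints of the true 3D line have zero error.
print(round(line_endpoint_error(l, K, R, t, P1), 6))  # 0.0
print(round(line_endpoint_error(l, K, R, t, P2), 6))  # 0.0
```

Because the error is linear in the endpoint coordinates, each line correspondence contributes linear constraints that can be appended to the point-based linear systems of EPnP or OPnP.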

Categories

Computer vision

Author keywords

Camera calibration

Scientific reference

A. Vakhitov, J. Funke and F. Moreno-Noguer. Accurate and linear time pose estimation from points and lines, 14th European Conference on Computer Vision, 2016, Amsterdam, Netherlands, in Computer Vision - ECCV 2016, Vol 9911 of Lecture Notes in Computer Science, pp. 583-599, 2016.