Research Project

EB-SLAM: Event-based simultaneous localization and mapping

Type

National Project

Start Date

01/01/2018

End Date

31/12/2020

Project Code

DPI2017-89564-P

Staff

Project Description

We will develop a high-speed, high-dynamic-range localization and mapping device that fuses inertial measurements with those of a dynamic vision sensor, better known as an event camera. Such a device could be used to estimate the motion of an autonomous vehicle or a UAV in GPS-denied environments, under high dynamics, and in conditions of poor illumination or severe illumination changes.

The problem is multidisciplinary, requiring expertise in computer vision, inertial systems and stochastic estimation. Our approach is unconventional because we will use event cameras. Unlike conventional cameras, which deliver full image frames at comparatively low frequencies, event cameras asynchronously report individual events, at very high rates, from those pixels whose brightness changes in the scene. Cameras of this type have seldom been used as proprioceptive sensors to estimate motion, which is the challenge we seek to explore. When fused with an IMU, the device will be able to recover 3D position, orientation, velocities and acceleration with respect to the gravity direction. Since camera events arrive with microsecond precision, the computed estimates can be published at rates approaching 1 MHz.
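
To make this data model concrete, here is a minimal Python sketch of the asynchronous event representation described above: each event carries a pixel location, a microsecond timestamp and a polarity, and a temporal slice of any length can be taken from the stream. The `Event` and `slice_events` names are illustrative assumptions, not code from the project.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    """A single event-camera event: pixel location, timestamp, polarity."""
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # timestamp in microseconds
    polarity: int   # +1 for a brightness increase, -1 for a decrease

def slice_events(events: List[Event], t0_us: int, dt_us: int) -> List[Event]:
    """Collect the events that fall inside a short temporal window.

    Unlike a frame camera, there is no fixed exposure time: the window
    length is free, down to the microsecond scale of the timestamps.
    """
    return [e for e in events if t0_us <= e.t_us < t0_us + dt_us]
```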

We will develop algorithms to extract the camera motion and the environment structure from the asynchronous sequence of camera events. To integrate the IMU's measurements of angular velocity and acceleration, we will model their systematic biases and perform an observability analysis. We will use advanced sparse nonlinear optimization techniques to solve the set of kinematic constraints resulting from IMU integration together with the spatial constraints resulting from the observation of the events. We will also address the relative spatial and temporal calibration between the two sensors. The accuracy and performance of our system will be validated against trajectories recorded with an absolute positioning system (OptiTrack). We will build prototypes for three specific applications: perception for humanoid locomotion, autonomous high-speed UAV maneuvering, and long-range car dead reckoning. All three applications are subject to abrupt illumination changes and/or high-speed motion, conditions for which event cameras are an adequate sensor choice.
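
As an illustration of the kinematic constraints mentioned above, the following sketch performs one strapdown IMU integration step with bias-corrected measurements. It is a minimal sketch under stated assumptions, not the project's implementation: the function names (`integrate_imu`, `so3_exp`), the gravity constant and the state layout are hypothetical, and noise modeling is omitted.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # assumed world-frame gravity (m/s^2)

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(phi):
    """Rotation matrix from a rotation vector (Rodrigues' formula)."""
    angle = np.linalg.norm(phi)
    if angle < 1e-9:
        return np.eye(3) + skew(phi)  # first-order approximation
    K = skew(phi / angle)
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def integrate_imu(R, p, v, gyro, acc, gyro_bias, acc_bias, dt):
    """One strapdown integration step with the systematic biases removed.

    R, p, v: orientation (3x3), position and velocity in the world frame.
    gyro, acc: raw angular rate (rad/s) and specific force (m/s^2).
    """
    w = gyro - gyro_bias                  # bias-corrected angular rate
    a = R @ (acc - acc_bias) + GRAVITY    # bias-corrected world acceleration
    R_next = R @ so3_exp(w * dt)
    v_next = v + a * dt
    p_next = p + v * dt + 0.5 * a * dt * dt
    return R_next, p_next, v_next
```

In the optimization, the residual between the state predicted by compounding such steps and the state implied by the event observations would form one kinematic constraint per IMU interval, which the sparse nonlinear solver minimizes jointly with the spatial constraints.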

Project Publications

Journal Publications

  • J. Vallvé, J. Solà and J. Andrade-Cetto. Graph SLAM sparsification with populated topologies using factor descent optimization. IEEE Robotics and Automation Letters, 3(2): 1322-1329, 2018.