Towards SLAM with an events-based camera

Technical Report (2016)







Event-based cameras have incredible potential in real-time and real-world robotics. They would enable more efficient algorithms in applications whose demanding requirements, such as rapid dynamic motion and high dynamic range, make standard cameras run into problems. While traditional cameras follow the frame-based paradigm, in which a shutter captures a fixed number of pictures per second, bio-inspired event cameras have pixels that respond independently to changes in log-intensity, generating asynchronous events. A special appeal of this type of camera is its low bandwidth, since the stream of events contains all the information while getting rid of redundancy. These sensors, which mimic some properties of the human retina, have microsecond latency and a 120 dB dynamic range (in contrast to the 60 dB of standard cameras).
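The pixel behavior described above can be illustrated with a minimal simulation of the standard event generative model: a pixel fires an event whenever its log-intensity has changed by more than a contrast threshold since the last event at that pixel. This is a hedged sketch, not the report's implementation; the function name, the threshold value `C`, and the frame-based input are illustrative assumptions.

```python
import numpy as np

def generate_events(frames, timestamps, C=0.2, eps=1e-6):
    """Simulate an event stream from a sequence of intensity frames.

    An event (x, y, t, p) is fired at a pixel whenever the log-intensity
    has changed by at least the contrast threshold C since the last event
    at that pixel; polarity p is +1 for brightening, -1 for darkening.
    (Illustrative model only: at most one event per pixel per frame.)
    """
    # Per-pixel log-intensity reference, set by the last event at each pixel.
    log_ref = np.log(frames[0].astype(np.float64) + eps)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + eps)
        diff = log_i - log_ref
        ys, xs = np.nonzero(np.abs(diff) >= C)  # pixels crossing the threshold
        for x, y in zip(xs, ys):
            p = 1 if diff[y, x] > 0 else -1
            events.append((int(x), int(y), t, p))
            log_ref[y, x] += p * C  # move the reference toward the new level
    return events
```

Note how the output is sparse: pixels whose intensity stays constant produce no events at all, which is the source of the low bandwidth mentioned above.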

However, the impact of event cameras has so far been small due to the need for completely new algorithms: there is no global measurement of intensity that would allow the use of current methods. The fact that an event corresponds to an asynchronous local intensity difference makes recovering both the motion and the scene a challenging problem. This report illustrates the several problems that must be faced when dealing with this task, and some of the different approaches taken.

First of all, we explain the generative model of the event camera and some preliminaries, followed by the different approaches. Finally, we present the conclusions and a glossary of the code.


Categories: computer vision.

Scientific reference

I. Clavera, J. Solà and J. Andrade-Cetto. Towards SLAM with an events-based camera. Technical Report IRI-TR-16-07, Institut de Robòtica i Informàtica Industrial, CSIC-UPC, 2016.