Research Project

EBCON: Motion estimation and control with event cameras

Type: National Project

Start Date: 01/09/2021

End Date: 31/08/2024

Project Code: PID2020-119244GB-I00

Project Description

Project PID2020-119244GB-I00 funded by MCIN/AEI/10.13039/501100011033

Agile motion control for high-dynamic robotics applications relies on two key elements: a very fast and reliable system to accurately compute motion estimates, and a fast and reliable mechanism to compute and execute motion control commands.
Event cameras are bio-inspired silicon retinas in which each pixel independently detects changes in luminance and asynchronously reports the pixel coordinates where a change occurred; these reports are called events. Since no frame or image is produced, events can be detected and transmitted with latencies on the order of microseconds. In this project, we will investigate methods for fast, robust and accurate estimation of the quantities that govern the dynamics, from camera motion to environment structure, while also exploring ways to use these fast estimators to produce commands for fast motion control. The objective is to provide tools for a complete perception-action pipeline suitable for the most demanding high-dynamics robotic platforms, such as humanoids, legged robots, aerial vehicles and aerial manipulators.
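To make the event representation concrete, here is a minimal Python sketch (illustrative only, not project code; the 640x480 resolution and all names are assumptions) of a single event as a (timestamp, x, y, polarity) record, together with a simple "time surface" that keeps the latest event time at each pixel, a common frame-like summary that downstream estimators can consume:

    # Minimal, hypothetical event representation -- not EBCON code.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Event:
        t: float       # timestamp in seconds (microsecond resolution in practice)
        x: int         # pixel column
        y: int         # pixel row
        polarity: int  # +1 brightness increase, -1 brightness decrease

    def update_time_surface(surface: np.ndarray, ev: Event) -> None:
        # Keep the most recent event timestamp at each pixel; the resulting
        # "time surface" summarizes the asynchronous stream in array form.
        surface[ev.y, ev.x] = ev.t

    surface = np.zeros((480, 640))  # assumed sensor resolution
    for ev in (Event(1e-6, 10, 20, 1), Event(3e-6, 11, 20, -1)):
        update_time_surface(surface, ev)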
Regarding perception, EBCON is the second project we undertake at IRI devoted to event-based vision. In the first project, EBSLAM, our efforts focused on recovering a good estimate of camera motion while building a map of the environment. Today we are able to estimate camera motion at a 10 kHz rate, for motion bursts with accelerations of 20 g, and with precision comparable to that of an OptiTrack system. In EBCON we will first continue our work on SLAM with event cameras, this time building the map automatically and in real time. Second, we will use artificial neural networks to learn discriminative features from events. Third, we will investigate the use of spiking neural networks to compute salient features from events, estimate optical flow, and recover camera motion.
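As a toy illustration of the spiking paradigm mentioned above, the sketch below implements a single leaky integrate-and-fire neuron driven by event timestamps. It is not the project's network, and every parameter is a placeholder; it only shows how densely timed events, as produced by fast motion, make a spiking unit fire:

    # Single leaky integrate-and-fire neuron -- hypothetical parameters.
    import math

    def lif_spikes(event_times, tau=0.01, weight=0.4, threshold=1.0):
        # Integrate input events with exponential leak; return the times
        # at which the membrane potential crosses the firing threshold.
        v, t_prev, out = 0.0, 0.0, []
        for t in sorted(event_times):
            v *= math.exp(-(t - t_prev) / tau)  # leak since the last event
            v += weight                         # integrate the incoming event
            if v >= threshold:
                out.append(t)                   # fire and reset
                v = 0.0
            t_prev = t
        return out

    # Three events 1 ms apart fire the neuron; an isolated event does not.
    print(lif_spikes([0.001, 0.002, 0.003, 0.100]))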
In biology, most information processing happens through spike-based representations: spikes encode sensory data, spikes perform computation, and spikes transmit actuator commands. The event-based paradigm therefore seems applicable not only to perception and inference, but also to control. In EBCON we will explore the design of reactive controllers that require fast reaction times, such as obstacle avoidance for UAVs. However, modern high-dynamics robots such as humanoids or aerial manipulators must execute carefully planned maneuvers and cannot rely on reactive control alone; they call for deliberative control techniques such as nonlinear model predictive control (nMPC). In this project, we want to explore techniques that, starting from the most modern nMPC formulations, allow us to match the dynamics observed by our event-based estimators.
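To ground the idea, the following sketch shows the receding-horizon structure behind MPC on a toy 1-D double integrator, solved at each step with a generic optimizer. Real nMPC for humanoids or aerial manipulators uses dedicated solvers and full dynamic models; this is only a structural illustration under those simplifying assumptions:

    # Receding-horizon (MPC) loop on a double integrator -- toy example only.
    import numpy as np
    from scipy.optimize import minimize

    DT, H = 0.05, 10  # control period [s] and horizon length (assumed values)

    def horizon_cost(u, x0):
        # Simulate x'' = u over the horizon and accumulate a quadratic cost
        # that drives position and velocity toward the origin.
        p, v, cost = x0[0], x0[1], 0.0
        for uk in u:
            v += uk * DT
            p += v * DT
            cost += p**2 + 0.1 * v**2 + 0.01 * uk**2
        return cost

    x = np.array([1.0, 0.0])              # initial position and velocity
    u_guess = np.zeros(H)
    for _ in range(40):                   # closed loop: replan at every step
        sol = minimize(horizon_cost, u_guess, args=(x,),
                       method='SLSQP', bounds=[(-2.0, 2.0)] * H)
        u0 = float(sol.x[0])              # apply only the first command
        v_next = x[1] + u0 * DT
        x = np.array([x[0] + v_next * DT, v_next])
        u_guess = np.r_[sol.x[1:], 0.0]   # warm-start the next solve
    print(x)                              # state should approach the origin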
In the course of EBCON, results will be demonstrated on a series of robotic prototypes of increasing complexity. Motion estimation will be developed first for a hand-held camera moving freely in 3D, and then for a UAV, also moving freely in 3D but additionally incorporating its kinematic model. Likewise, the control algorithms will be developed first in simulated settings, then for a simple 2-DoF arm, for UAV control, for the constrained motion of IRI's Ivo dual-arm mobile manipulator, and finally for the complex case of the LAAS humanoid robot TALOS with 32 DoF. Two robotics companies have expressed interest in the project objectives and are awaiting its outcome. When the time comes, we will seek to transfer the project results to their platforms.

Project Publications

Journal Publications

  • I. del Pino, A. Santamaria-Navarro, A. Garrell Zulueta, F. Torres and J. Andrade-Cetto. Probabilistic graph-based real-time ground segmentation for urban robotics. IEEE Transactions on Intelligent Vehicles, 2024, to appear.

  • W.O. Chamorro, J. Solà and J. Andrade-Cetto. Event-based line SLAM in real-time. IEEE Robotics and Automation Letters, 7(3): 8146-8153, 2022.

  • M. Saboia Da Silva, L. Clark, V. Thangavelu, J. Edlund, K. Otsu, G.J. Correa, V.S. Varadharajan, A. Santamaria-Navarro, T. Touma, A. Bouman, H. Melikyan, T. Pailevanian, S. Kim, A. Archanian, T.S. Vaquero, G. Beltrame, N. Napp, G. Pessin and A. Agha-mohammadi. ACHORD: Communication-aware multi-robot coordination with intermittent connectivity. IEEE Robotics and Automation Letters, 7(4): 10184-10191, 2022.

  • J. Solà, J. Vallvé, J. Casals, J. Deray, M. Fourmy, D. Atchuthan, A. Corominas Murtra and J. Andrade-Cetto. WOLF: A modular estimation framework for robotics based on factor graphs. IEEE Robotics and Automation Letters, 7(2): 4710-4717, 2022.


Conference Publications

  • W.O. Chamorro, J. Solà and J. Andrade-Cetto. Event-IMU fusion strategies for faster-than-IMU estimation throughput. 4th CVPR International Workshop on Event Vision, Vancouver, 2023, pp. 3975-3982.

  • Y. Tian and J. Andrade-Cetto. Egomotion from event-based SNN optical flow. ACM International Conference on Neuromorphic Systems, Santa Fe, NM, USA, 2023, pp. 8:1-8.

  • Y. Tian and J. Andrade-Cetto. Spiking neural network for event camera ego-motion estimation. nanoGe MAT-SUS Symposium on Neuromorphic Sensory-Processing-Learning Systems inspired in Computational Neuroscience, Barcelona, 2022.

  • Y. Tian and J. Andrade-Cetto. Event transformer FlowNet for optical flow estimation. British Machine Vision Conference, London, 2022.
