Event-based optical flow: Method categorisation and review of techniques that leverage deep learning
Keywords: neural networks, computer vision, robotics, optical flow, deep learning, event camera
Abstract
Developing new convolutional neural network architectures and event-based camera representations could play a crucial role in autonomous navigation, pose estimation, and visual odometry applications. This study explores the potential of event cameras for optical flow estimation using convolutional neural networks. We provide a detailed description of the principles of operation of event cameras and the software available for extracting and processing their output, along with the various event representation methods this technology offers. We also identify four categories of methods for estimating optical flow with event cameras: gradient-based, frequency-based, correlation-based, and neural network models. We report on these categories, including their latest developments, current status, and challenges. We review existing datasets and identify those suited to evaluating deep learning-based optical flow estimation methods. We evaluate the accuracy of the implemented methods using the average endpoint error metric, while the efficiency of the algorithms is assessed in terms of execution time. Finally, we discuss research directions that promise future advances in this field.
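The abstract cites the average endpoint error (AEE) as the accuracy metric. As a reference, the sketch below shows one common way to compute AEE for a dense flow field; the function name, array shapes, and the optional validity mask (event-camera ground truth is often sparse) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumptions noted above) of the average endpoint error (AEE)
# metric used to compare a predicted optical-flow field against ground truth.
import numpy as np

def average_endpoint_error(flow_pred: np.ndarray,
                           flow_gt: np.ndarray,
                           valid_mask: np.ndarray | None = None) -> float:
    """Mean Euclidean distance between predicted and ground-truth flow vectors.

    flow_pred, flow_gt: arrays of shape (H, W, 2) holding (u, v) displacements.
    valid_mask: optional boolean array of shape (H, W) marking pixels that
                have ground-truth flow.
    """
    error = np.linalg.norm(flow_pred - flow_gt, axis=-1)  # per-pixel endpoint error
    if valid_mask is not None:
        error = error[valid_mask]
    return float(error.mean())

# Example usage with synthetic data standing in for a network prediction.
if __name__ == "__main__":
    h, w = 260, 346  # DAVIS346-like resolution, chosen only for illustration
    gt = np.random.randn(h, w, 2).astype(np.float32)
    pred = gt + 0.1 * np.random.randn(h, w, 2).astype(np.float32)
    mask = np.random.rand(h, w) > 0.5
    print(f"AEE: {average_endpoint_error(pred, gt, mask):.4f}")
```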
More information
Title according to WOS: Event-based optical flow: Method categorisation and review of techniques that leverage deep learning
Volume: 635
Publication date: 2025
Language: English
DOI: 10.1016/j.neucom.2025.129899
Notes: ISI