Conference Paper: Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization

Title: Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization
Authors: Guan, Weipeng; Lu, Peng
Issue Date: 23-Oct-2022
Publisher: IEEE
Abstract:

Event cameras are biologically inspired vision sensors that capture pixel-level illumination changes instead of intensity images at a fixed frame rate. They offer many advantages over standard cameras, such as high dynamic range, high temporal resolution (low latency), and no motion blur. Developing state estimation algorithms based on event cameras therefore offers exciting opportunities for autonomous systems and robots. In this paper, we propose a monocular visual-inertial odometry for event cameras based on event-corner feature detection and matching with well-designed feature management. More specifically, two different kinds of event representations based on the time surface are designed to realize event-corner feature tracking (for front-end incremental estimation) and matching (for loop closure detection). Furthermore, the proposed event representations are used to set a mask for detecting event-corner features on the raw event stream, which ensures the uniform spatial distribution and spatial consistency of the event-corner features. Finally, a tightly coupled, graph-based optimization framework is designed to obtain highly accurate state estimation by fusing pre-integrated IMU measurements and event-corner observations. We quantitatively validate the performance of our system on event cameras of different resolutions: DAVIS240C (240×180, public dataset, achieving state-of-the-art accuracy), DAVIS346 (346×240, real test), and DVXplorer (640×480, real test). Furthermore, we qualitatively demonstrate the accuracy, robustness, loop closure, and re-localization performance of our framework on different large-scale datasets, as well as an autonomous quadrotor flight using our Event Visual-Inertial Odometry (EVIO) framework. Videos of all the evaluations are presented on the project website.
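The time-surface representation underpinning the feature tracking described above can be sketched as follows. This is an illustrative reconstruction of the general time-surface idea (the function `time_surface`, the decay constant `tau`, and the event layout are assumptions for the sketch, not the authors' implementation):

```python
import numpy as np

def time_surface(events, width, height, t_query, tau=0.03):
    """Build a time surface from an event stream.

    events: iterable of (x, y, t, polarity) tuples, t in seconds, t <= t_query.
    Returns a (height, width) array in [0, 1]: pixels that fired recently
    are bright; stale pixels decay exponentially with time constant tau.
    """
    t_last = np.full((height, width), -np.inf)   # last firing time per pixel
    for x, y, t, _p in events:
        t_last[y, x] = t                          # keep the most recent event
    ts = np.exp((t_last - t_query) / tau)         # exponential temporal decay
    ts[np.isinf(t_last)] = 0.0                    # pixels that never fired
    return ts

# Toy 2x2 sensor: three events at different times, queried at t = 0.10 s.
events = [(0, 0, 0.00, 1), (1, 0, 0.09, -1), (1, 1, 0.10, 1)]
surf = time_surface(events, width=2, height=2, t_query=0.10, tau=0.03)
```

A corner detector (e.g. a FAST-like test) can then be run on `surf`, and a mask built by zeroing regions around already-tracked features so new detections stay spatially spread out, as the abstract describes.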


Persistent Identifier: http://hdl.handle.net/10722/339568
ISI Accession Number ID: WOS:000908368202008

DC Fields:
dc.contributor.author: Guan, Weipeng
dc.contributor.author: Lu, Peng
dc.date.accessioned: 2024-03-11T10:37:42Z
dc.date.available: 2024-03-11T10:37:42Z
dc.date.issued: 2022-10-23
dc.identifier.uri: http://hdl.handle.net/10722/339568
dc.description.abstract: (abstract as given above)
dc.language: eng
dc.publisher: IEEE
dc.relation.ispartof: 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems - IROS (23/10/2022-27/10/2022, Kyoto, Japan)
dc.title: Monocular Event Visual Inertial Odometry based on Event-corner using Sliding Windows Graph-based Optimization
dc.type: Conference_Paper
dc.identifier.doi: 10.1109/IROS47612.2022.9981970
dc.identifier.scopus: eid_2-s2.0-85139074653
dc.identifier.volume: 2022-October
dc.identifier.spage: 2438
dc.identifier.epage: 2445
dc.identifier.isi: WOS:000908368202008
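Dublin Core records like the listing above are typically served by repositories over OAI-PMH as `oai_dc` XML. A minimal sketch of parsing such a record with the standard library (the inline XML is a trimmed, hypothetical rendering of the fields above, not the repository's actual response):

```python
import xml.etree.ElementTree as ET

# Trimmed oai_dc payload mirroring a few of the DC fields above.
RECORD = """<oai_dc:dc
    xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>Monocular Event Visual Inertial Odometry based on Event-corner \
using Sliding Windows Graph-based Optimization</dc:title>
  <dc:creator>Guan, Weipeng</dc:creator>
  <dc:creator>Lu, Peng</dc:creator>
  <dc:identifier>10.1109/IROS47612.2022.9981970</dc:identifier>
</oai_dc:dc>"""

DC = "{http://purl.org/dc/elements/1.1/}"   # Dublin Core namespace prefix

root = ET.fromstring(RECORD)
title = root.findtext(DC + "title")                       # single element
creators = [e.text for e in root.findall(DC + "creator")]  # repeated element
doi = root.findtext(DC + "identifier")
```

In a real harvest, the same parsing would be applied to the response of an OAI-PMH `GetRecord` request with `metadataPrefix=oai_dc`.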
