
Article: SpikeMOT: Event-Based Multi-Object Tracking With Sparse Motion Features

Title: SpikeMOT: Event-Based Multi-Object Tracking With Sparse Motion Features
Authors: Wang, Song; Wang, Zhu; Li, Can; Qi, Xiaojuan; So, Hayden Kwok Hay
Keywords: event camera; event-based MOT datasets; event-based vision; multi-object tracking (MOT); spiking neural networks
Issue Date: 1-Jan-2025
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Access, 2025, v. 13, p. 214-230
Abstract: In comparison to conventional RGB cameras, the exceptional temporal resolution of event cameras allows them to capture rich information between frames, making them prime candidates for object tracking. Yet in practice, despite their theoretical advantages, the body of work on event-based multi-object tracking (MOT) remains in its infancy, especially in real-world environments where events from complex backgrounds and camera motion can easily obscure the true target motion. To address these limitations, we introduce SpikeMOT, an innovative event-based MOT framework employing spiking neural networks (SNNs) within a Siamese architecture. SpikeMOT extracts and associates sparse spatiotemporal features from event streams, enabling high-frequency object motion inference while preserving object identities. Additionally, a simultaneous object detector provides updated spatial information of these objects at an equivalent frame rate. To evaluate the efficacy of SpikeMOT, we present DSEC-MOT, a meticulously constructed, real-world event-based MOT benchmark. This dataset features manually corrected annotations for objects experiencing severe occlusions, frequent intersections, and out-of-view scenarios commonly encountered in real-world applications. Extensive experiments on the DSEC-MOT and FE240hz datasets demonstrate SpikeMOT's superior tracking accuracy under demanding conditions, advancing the state of the art in event-based multi-object tracking.
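The abstract describes a Siamese design in which both branches compute the same sparse spatiotemporal encoding of an event stream, and object identities are preserved by comparing those encodings across time. The following is a toy sketch of that general idea only, not the authors' implementation: the `sparse_features` helper, the 4-pixel spatial cells, and the two temporal bins are all illustrative assumptions, and SpikeMOT itself uses spiking neural networks rather than histograms.

```python
import math

def sparse_features(events, cell=4, bins=2):
    """Bin events (x, y, t, polarity) into a sparse spatiotemporal
    histogram keyed by (x_cell, y_cell, time_bin, polarity).
    Only occupied cells are stored, mirroring the sparsity of event data.
    Cell and bin sizes here are arbitrary illustrative choices."""
    t0 = min(e[2] for e in events)
    t1 = max(e[2] for e in events)
    span = (t1 - t0) or 1  # avoid division by zero for a single timestamp
    feat = {}
    for x, y, t, p in events:
        tb = min(int((t - t0) / span * bins), bins - 1)
        key = (x // cell, y // cell, tb, p)
        feat[key] = feat.get(key, 0) + 1
    return feat

def match_score(a, b):
    """Siamese-style association score: both inputs come from the same
    encoding, and similarity is measured by cosine of the sparse vectors."""
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

A track whose new events produce a high `match_score` against its stored features keeps its identity; a low score suggests a different object.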
Persistent Identifier: http://hdl.handle.net/10722/355847
ISI Accession Number ID: WOS:001389744500022


Dublin Core record:
dc.contributor.author: Wang, Song
dc.contributor.author: Wang, Zhu
dc.contributor.author: Li, Can
dc.contributor.author: Qi, Xiaojuan
dc.contributor.author: So, Hayden Kwok Hay
dc.date.accessioned: 2025-05-18T00:40:06Z
dc.date.available: 2025-05-18T00:40:06Z
dc.date.issued: 2025-01-01
dc.identifier.citation: IEEE Access, 2025, v. 13, p. 214-230
dc.identifier.uri: http://hdl.handle.net/10722/355847
dc.language: eng
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.ispartof: IEEE Access
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.subject: event camera
dc.subject: event-based MOT datasets
dc.subject: event-based vision
dc.subject: Multi-object tracking (MOT)
dc.subject: spiking neural networks
dc.title: SpikeMOT: Event-Based Multi-Object Tracking With Sparse Motion Features
dc.type: Article
dc.identifier.doi: 10.1109/ACCESS.2024.3523411
dc.identifier.scopus: eid_2-s2.0-85213679915
dc.identifier.volume: 13
dc.identifier.spage: 214
dc.identifier.epage: 230
dc.identifier.eissn: 2169-3536
dc.identifier.isi: WOS:001389744500022
dc.identifier.issnl: 2169-3536
