Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.1145/3593587
- Scopus: eid_2-s2.0-85181673552

Citations:
- Scopus: 0
Article: A Reconfigurable Architecture for Real-time Event-based Multi-Object Tracking
| Title | A Reconfigurable Architecture for Real-time Event-based Multi-Object Tracking |
|---|---|
| Authors | Gao, Yizhao; Wang, Song; So, Hayden Kwok Hay |
| Keywords | attention unit; Dynamic Vision Sensors; event camera; event sensors; FPGA; hardware/software co-design; HOTA; multi-object tracking; REMOT |
| Issue Date | 1-Sep-2023 |
| Publisher | Association for Computing Machinery (ACM) |
| Citation | ACM Transactions on Reconfigurable Technology and Systems, 2023, v. 16, n. 4, p. 1-26 |
| Abstract | Although advances in event-based machine vision algorithms have demonstrated unparalleled capabilities in performing some of the most demanding tasks, their implementations under stringent real-time and power constraints in edge systems remain a major challenge. In this work, a reconfigurable hardware-software architecture called REMOT, which performs real-time event-based multi-object tracking on FPGAs, is presented. REMOT performs vision tasks by defining a set of actions over attention units (AUs). These actions allow AUs to track an object candidate autonomously by adjusting its region of attention and allow information gathered by each AU to be used for making algorithmic-level decisions. Taking advantage of this modular structure, algorithm-architecture codesign can be performed by implementing different parts of the algorithm in either hardware or software for different tradeoffs. Results show that REMOT can process 0.43-2.91 million events per second at 1.75-5.45 W. Compared with the software baseline, our implementation achieves up to 44 times higher throughput and 35.4 times higher power efficiency. Migrating the Merge operation to hardware further reduces the worst-case latency to be 95 times shorter than the software baseline. By varying the AU configuration and operation, a reduction of 0.59-0.77 mW per AU on the programmable logic has also been demonstrated. |
| Persistent Identifier | http://hdl.handle.net/10722/368596 |
| ISSN | 1936-7406 (2023 Impact Factor: 3.1; 2023 SCImago Journal Rank: 0.802) |
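The abstract's attention-unit mechanism (each AU autonomously tracks one object candidate by adjusting its region of attention, with a Merge operation fusing overlapping units) can be illustrated with a minimal software sketch. All names, parameters, and update rules below are illustrative assumptions, not the paper's actual REMOT implementation:

```python
# Sketch of the attention-unit (AU) idea: each AU owns a square region of
# attention, absorbs sensor events that fall inside it, and drifts toward
# them; a Merge pass fuses AUs whose centers have converged.
from dataclasses import dataclass

@dataclass
class AttentionUnit:
    cx: float          # center x of the region of attention
    cy: float          # center y
    r: float = 8.0     # half-width of the square region (assumed fixed here)
    hits: int = 0      # events absorbed after the AU was spawned

    def contains(self, x, y):
        return abs(x - self.cx) <= self.r and abs(y - self.cy) <= self.r

    def absorb(self, x, y, alpha=0.2):
        # Move the region of attention toward the event (exponential average).
        self.cx += alpha * (x - self.cx)
        self.cy += alpha * (y - self.cy)
        self.hits += 1

def process_event(aus, x, y, spawn_r=8.0):
    """Route one event to the first AU whose region covers it,
    or spawn a new AU as a fresh object candidate."""
    for au in aus:
        if au.contains(x, y):
            au.absorb(x, y)
            return au
    au = AttentionUnit(cx=x, cy=y, r=spawn_r)
    aus.append(au)
    return au

def merge(aus, dist=4.0):
    """Algorithmic-level decision: fuse AUs whose centers drifted together."""
    out = []
    for au in aus:
        for kept in out:
            if abs(au.cx - kept.cx) <= dist and abs(au.cy - kept.cy) <= dist:
                kept.hits += au.hits  # fold the duplicate into the survivor
                break
        else:
            out.append(au)
    return out

if __name__ == "__main__":
    aus = []
    # Two event clusters -> two object candidates.
    for x, y in [(10, 10), (11, 10), (10, 11), (50, 50), (51, 49)]:
        process_event(aus, x, y)
    aus = merge(aus)
    print(len(aus))  # 2 tracked candidates
```

The per-event routing loop is what the paper maps onto parallel AU hardware on the FPGA; moving the Merge pass into hardware is the step the abstract credits with the 95x worst-case latency reduction.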
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Gao, Yizhao | - |
| dc.contributor.author | Wang, Song | - |
| dc.contributor.author | So, Hayden Kwok Hay | - |
| dc.date.accessioned | 2026-01-15T00:35:27Z | - |
| dc.date.available | 2026-01-15T00:35:27Z | - |
| dc.date.issued | 2023-09-01 | - |
| dc.identifier.citation | ACM Transactions on Reconfigurable Technology and Systems, 2023, v. 16, n. 4, p. 1-26 | - |
| dc.identifier.issn | 1936-7406 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/368596 | - |
| dc.description.abstract | Although advances in event-based machine vision algorithms have demonstrated unparalleled capabilities in performing some of the most demanding tasks, their implementations under stringent real-time and power constraints in edge systems remain a major challenge. In this work, a reconfigurable hardware-software architecture called REMOT, which performs real-time event-based multi-object tracking on FPGAs, is presented. REMOT performs vision tasks by defining a set of actions over attention units (AUs). These actions allow AUs to track an object candidate autonomously by adjusting its region of attention and allow information gathered by each AU to be used for making algorithmic-level decisions. Taking advantage of this modular structure, algorithm-architecture codesign can be performed by implementing different parts of the algorithm in either hardware or software for different tradeoffs. Results show that REMOT can process 0.43-2.91 million events per second at 1.75-5.45 W. Compared with the software baseline, our implementation achieves up to 44 times higher throughput and 35.4 times higher power efficiency. Migrating the Merge operation to hardware further reduces the worst-case latency to be 95 times shorter than the software baseline. By varying the AU configuration and operation, a reduction of 0.59-0.77 mW per AU on the programmable logic has also been demonstrated. | - |
| dc.language | eng | - |
| dc.publisher | Association for Computing Machinery (ACM) | - |
| dc.relation.ispartof | ACM Transactions on Reconfigurable Technology and Systems | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.subject | attention unit | - |
| dc.subject | Dynamic Vision Sensors | - |
| dc.subject | event camera | - |
| dc.subject | event sensors | - |
| dc.subject | FPGA | - |
| dc.subject | hardware/software co-design | - |
| dc.subject | HOTA | - |
| dc.subject | multi-object tracking | - |
| dc.subject | REMOT | - |
| dc.title | A Reconfigurable Architecture for Real-time Event-based Multi-Object Tracking | - |
| dc.type | Article | - |
| dc.description.nature | published_or_final_version | - |
| dc.identifier.doi | 10.1145/3593587 | - |
| dc.identifier.scopus | eid_2-s2.0-85181673552 | - |
| dc.identifier.volume | 16 | - |
| dc.identifier.issue | 4 | - |
| dc.identifier.spage | 1 | - |
| dc.identifier.epage | 26 | - |
| dc.identifier.eissn | 1936-7414 | - |
| dc.identifier.issnl | 1936-7406 | - |
