Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/TIV.2023.3339144
- Scopus: eid_2-s2.0-85179833830
Citations:
- Scopus: 0
Article: ECMD: An Event-Centric Multisensory Driving Dataset for SLAM
Field | Value |
---|---|
Title | ECMD: An Event-Centric Multisensory Driving Dataset for SLAM |
Authors | Chen, Peiyu; Guan, Weipeng; Huang, Feng; Zhong, Yihan; Wen, Weisong; Hsu, Li-Ta; Lu, Peng |
Keywords | Autonomous Driving; Cameras; Dataset; Event-based Vision; Laser radar; Multi-sensor Fusion; Robots; Sensor systems; Sensors; Simultaneous localization and mapping; SLAM; Visualization |
Issue Date | 5-Dec-2023 |
Publisher | Institute of Electrical and Electronics Engineers |
Citation | IEEE Transactions on Intelligent Vehicles, 2023 |
Abstract | Leveraging multiple sensors enhances complex environmental perception and increases resilience to varying luminance conditions and high-speed motion patterns, achieving precise localization and mapping. This paper proposes ECMD, an event-centric multisensory dataset containing 81 sequences and covering over 200 km of challenging driving scenarios, including high-speed motion, repetitive scenes, dynamic objects, etc. ECMD provides data from two sets of stereo event cameras with different resolutions (640×480, 346×260), stereo industrial cameras, an infrared camera, a top-installed mechanical LiDAR with two slanted LiDARs, two consumer-level GNSS receivers, and an onboard IMU. Meanwhile, the ground truth of the vehicle was obtained using a centimeter-level high-accuracy GNSS-RTK/INS navigation system. All sensors are well calibrated and temporally synchronized at the hardware level, recording data simultaneously. We additionally evaluate several state-of-the-art SLAM algorithms to benchmark visual and LiDAR SLAM and identify their limitations. The dataset is available at https://arclab-hku.github.io/ecmd/ . |
Persistent Identifier | http://hdl.handle.net/10722/339581 |
ISSN | 2379-8858 (2023 Impact Factor: 14.0; 2023 SCImago Journal Rankings: 2.469) |
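The SLAM benchmarking mentioned in the abstract typically compares an estimated trajectory against the GNSS-RTK/INS ground truth via absolute trajectory error (ATE) after a rigid alignment. The sketch below is not the paper's evaluation code; it is a minimal illustration using the standard Kabsch/Umeyama least-squares alignment, with function names of my own choosing:

```python
import numpy as np

def align_rigid(gt, est):
    """Least-squares rigid alignment (Kabsch, no scale) of est onto gt.

    gt, est: (N, 3) arrays of time-associated positions.
    Returns rotation R and translation t mapping est into the gt frame.
    """
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    # Cross-covariance between centered point sets.
    H = (est - mu_est).T @ (gt - mu_gt)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction to guarantee a proper rotation (det(R) = +1).
    S = np.eye(3)
    if np.linalg.det(Vt.T @ U.T) < 0:
        S[2, 2] = -1
    R = Vt.T @ S @ U.T
    t = mu_gt - R @ mu_est
    return R, t

def ate_rmse(gt, est):
    """Root-mean-square absolute trajectory error after rigid alignment."""
    R, t = align_rigid(gt, est)
    err = gt - (est @ R.T + t)
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```

In practice the two trajectories must first be associated by timestamp (the dataset's hardware-level synchronization simplifies this), and published benchmarks often use dedicated tooling for this step; the alignment-then-RMSE core is the same.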
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, Peiyu | - |
dc.contributor.author | Guan, Weipeng | - |
dc.contributor.author | Huang, Feng | - |
dc.contributor.author | Zhong, Yihan | - |
dc.contributor.author | Wen, Weisong | - |
dc.contributor.author | Hsu, Li-Ta | - |
dc.contributor.author | Lu, Peng | - |
dc.date.accessioned | 2024-03-11T10:37:47Z | - |
dc.date.available | 2024-03-11T10:37:47Z | - |
dc.date.issued | 2023-12-05 | - |
dc.identifier.citation | IEEE Transactions on Intelligent Vehicles, 2023 | - |
dc.identifier.issn | 2379-8858 | - |
dc.identifier.uri | http://hdl.handle.net/10722/339581 | - |
dc.description.abstract | <p>Leveraging multiple sensors enhances complex environmental perception and increases resilience to varying luminance conditions and high-speed motion patterns, achieving precise localization and mapping. This paper proposes ECMD, an event-centric multisensory dataset containing 81 sequences and covering over 200 km of challenging driving scenarios, including high-speed motion, repetitive scenes, dynamic objects, etc. ECMD provides data from two sets of stereo event cameras with different resolutions (640×480, 346×260), stereo industrial cameras, an infrared camera, a top-installed mechanical LiDAR with two slanted LiDARs, two consumer-level GNSS receivers, and an onboard IMU. Meanwhile, the ground truth of the vehicle was obtained using a centimeter-level high-accuracy GNSS-RTK/INS navigation system. All sensors are well calibrated and temporally synchronized at the hardware level, recording data simultaneously. We additionally evaluate several state-of-the-art SLAM algorithms to benchmark visual and LiDAR SLAM and identify their limitations. The dataset is available at https://arclab-hku.github.io/ecmd/ .</p> | - |
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers | - |
dc.relation.ispartof | IEEE Transactions on Intelligent Vehicles | - |
dc.subject | Autonomous Driving | - |
dc.subject | Cameras | - |
dc.subject | Dataset | - |
dc.subject | Event-based Vision | - |
dc.subject | Laser radar | - |
dc.subject | Multi-sensor Fusion | - |
dc.subject | Robots | - |
dc.subject | Sensor systems | - |
dc.subject | Sensors | - |
dc.subject | Simultaneous localization and mapping | - |
dc.subject | SLAM | - |
dc.subject | Visualization | - |
dc.title | ECMD: An Event-Centric Multisensory Driving Dataset for SLAM | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/TIV.2023.3339144 | - |
dc.identifier.scopus | eid_2-s2.0-85179833830 | - |
dc.identifier.eissn | 2379-8904 | - |
dc.identifier.issnl | 2379-8858 | - |