Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1109/IROS51168.2021.9635834
- Scopus: eid_2-s2.0-85124371985
Citations:
- Scopus: 0

Appears in Collections:
- Conference Paper: Accurate depth estimation from a hybrid event-RGB stereo setup
Title | Accurate depth estimation from a hybrid event-RGB stereo setup |
---|---|
Authors | Zuo, Yi Fan; Cui, Li; Peng, Xin; Xu, Yanyu; Gao, Shenghua; Wang, Xia; Kneip, Laurent |
Issue Date | 2021 |
Citation | IEEE International Conference on Intelligent Robots and Systems, 2021, p. 6833-6840 |
Abstract | Event-based visual perception is becoming increasingly popular owing to interesting sensor characteristics that enable the handling of difficult conditions such as highly dynamic motion or challenging illumination. The mostly complementary nature of event cameras, however, still means that the best results are achieved if the sensor is paired with a regular frame-based sensor. The present work aims at answering a simple question: assuming that both cameras do not share a common optical center, is it possible to exploit the hybrid stereo setup's baseline to perform accurate stereo depth estimation? We present a learning-based solution to this problem, leveraging modern spatio-temporal input representations as well as a novel hybrid pyramid attention module. Results on real data demonstrate competitive performance against pure frame-based stereo alternatives as well as the ability to maintain the advantageous properties of event-based sensors. |
Persistent Identifier | http://hdl.handle.net/10722/345167 |
ISSN | 2153-0858 |
2023 SCImago Journal Rankings | 1.094 |
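The abstract mentions "modern spatio-temporal input representations" for the event stream. The paper does not spell out the exact representation here, but a common choice in the event-camera literature is a voxel grid that bilinearly splats each event's signed polarity across adjacent temporal bins. The sketch below is purely illustrative of that general technique (function and variable names are our own, not from the paper):

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate an event stream of rows (t, x, y, polarity) into a
    spatio-temporal voxel grid, splitting each event's signed polarity
    bilinearly between the two nearest temporal bins."""
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3]
    # Normalize timestamps to the continuous bin range [0, num_bins - 1].
    t = (t - t[0]) / max(t[-1] - t[0], 1e-9) * (num_bins - 1)
    left = np.floor(t).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = t - left
    # Scatter-add so repeated (bin, y, x) indices accumulate correctly.
    np.add.at(voxel, (left, y, x), p * (1.0 - w_right))
    np.add.at(voxel, (right, y, x), p * w_right)
    return voxel
```

Because each event's two temporal weights sum to one, the grid preserves total signed event count; a tensor like this can then be fed to a convolutional stereo network alongside the RGB frame.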
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zuo, Yi Fan | - |
dc.contributor.author | Cui, Li | - |
dc.contributor.author | Peng, Xin | - |
dc.contributor.author | Xu, Yanyu | - |
dc.contributor.author | Gao, Shenghua | - |
dc.contributor.author | Wang, Xia | - |
dc.contributor.author | Kneip, Laurent | - |
dc.date.accessioned | 2024-08-15T09:25:39Z | - |
dc.date.available | 2024-08-15T09:25:39Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | IEEE International Conference on Intelligent Robots and Systems, 2021, p. 6833-6840 | - |
dc.identifier.issn | 2153-0858 | - |
dc.identifier.uri | http://hdl.handle.net/10722/345167 | - |
dc.description.abstract | Event-based visual perception is becoming increasingly popular owing to interesting sensor characteristics that enable the handling of difficult conditions such as highly dynamic motion or challenging illumination. The mostly complementary nature of event cameras, however, still means that the best results are achieved if the sensor is paired with a regular frame-based sensor. The present work aims at answering a simple question: assuming that both cameras do not share a common optical center, is it possible to exploit the hybrid stereo setup's baseline to perform accurate stereo depth estimation? We present a learning-based solution to this problem, leveraging modern spatio-temporal input representations as well as a novel hybrid pyramid attention module. Results on real data demonstrate competitive performance against pure frame-based stereo alternatives as well as the ability to maintain the advantageous properties of event-based sensors. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE International Conference on Intelligent Robots and Systems | - |
dc.title | Accurate depth estimation from a hybrid event-RGB stereo setup | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/IROS51168.2021.9635834 | - |
dc.identifier.scopus | eid_2-s2.0-85124371985 | - |
dc.identifier.spage | 6833 | - |
dc.identifier.epage | 6840 | - |
dc.identifier.eissn | 2153-0866 | - |