Article: Delving into the Devils of Bird's-Eye-View Perception: A Review, Evaluation and Recipe
Title | Delving into the Devils of Bird's-Eye-View Perception: A Review, Evaluation and Recipe |
---|---|
Authors | Li, Hongyang; Sima, Chonghao; Dai, Jifeng; Wang, Wenhai; Lu, Lewei; Wang, Huijie; Zeng, Jia; Li, Zhiqi; Yang, Jiazhi; Deng, Hanming; Tian, Hao; Xie, Enze; Xie, Jiangwei; Chen, Li; Li, Tianyu; Li, Yang; Gao, Yulu; Jia, Xiaosong; Liu, Si; Shi, Jianping; Lin, Dahua; Qiao, Yu |
Keywords | 3D detection and segmentation; autonomous driving challenge; bird's-eye-view (BEV) perception |
Issue Date | 2024 |
Citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, v. 46, n. 4, p. 2151-2170 |
Abstract | Learning powerful representations in bird's-eye-view (BEV) for perception tasks is trending and drawing extensive attention from both industry and academia. Conventional approaches for most autonomous driving algorithms perform detection, segmentation, tracking, etc., in a front or perspective view. As sensor configurations grow more complex, integrating multi-source information from different sensors and representing features in a unified view become vitally important. BEV perception offers several advantages: representing surrounding scenes in BEV is intuitive and fusion-friendly, and representing objects in BEV is most desirable for subsequent modules such as planning and control. The core problems for BEV perception lie in (a) how to reconstruct the lost 3D information via view transformation from perspective view to BEV; (b) how to acquire ground-truth annotations on the BEV grid; (c) how to formulate the pipeline to incorporate features from different sources and views; and (d) how to adapt and generalize algorithms as sensor configurations vary across different scenarios. In this survey, we review the most recent works on BEV perception and provide an in-depth analysis of different solutions. Moreover, several systematic BEV designs from industry are described as well. Furthermore, we introduce a full suite of practical guidelines to improve the performance of BEV perception tasks, covering camera, LiDAR, and fusion inputs. Finally, we point out future research directions in this area. We hope this report will benefit the community and encourage further research effort on BEV perception. |
Persistent Identifier | http://hdl.handle.net/10722/351485 |
ISSN | 0162-8828 (2023 Impact Factor: 20.8; 2023 SCImago Journal Rankings: 6.158) |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, Hongyang | - |
dc.contributor.author | Sima, Chonghao | - |
dc.contributor.author | Dai, Jifeng | - |
dc.contributor.author | Wang, Wenhai | - |
dc.contributor.author | Lu, Lewei | - |
dc.contributor.author | Wang, Huijie | - |
dc.contributor.author | Zeng, Jia | - |
dc.contributor.author | Li, Zhiqi | - |
dc.contributor.author | Yang, Jiazhi | - |
dc.contributor.author | Deng, Hanming | - |
dc.contributor.author | Tian, Hao | - |
dc.contributor.author | Xie, Enze | - |
dc.contributor.author | Xie, Jiangwei | - |
dc.contributor.author | Chen, Li | - |
dc.contributor.author | Li, Tianyu | - |
dc.contributor.author | Li, Yang | - |
dc.contributor.author | Gao, Yulu | - |
dc.contributor.author | Jia, Xiaosong | - |
dc.contributor.author | Liu, Si | - |
dc.contributor.author | Shi, Jianping | - |
dc.contributor.author | Lin, Dahua | - |
dc.contributor.author | Qiao, Yu | - |
dc.date.accessioned | 2024-11-20T03:56:38Z | - |
dc.date.available | 2024-11-20T03:56:38Z | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, v. 46, n. 4, p. 2151-2170 | - |
dc.identifier.issn | 0162-8828 | - |
dc.identifier.uri | http://hdl.handle.net/10722/351485 | - |
dc.description.abstract | Learning powerful representations in bird's-eye-view (BEV) for perception tasks is trending and drawing extensive attention from both industry and academia. Conventional approaches for most autonomous driving algorithms perform detection, segmentation, tracking, etc., in a front or perspective view. As sensor configurations grow more complex, integrating multi-source information from different sensors and representing features in a unified view become vitally important. BEV perception offers several advantages: representing surrounding scenes in BEV is intuitive and fusion-friendly, and representing objects in BEV is most desirable for subsequent modules such as planning and control. The core problems for BEV perception lie in (a) how to reconstruct the lost 3D information via view transformation from perspective view to BEV; (b) how to acquire ground-truth annotations on the BEV grid; (c) how to formulate the pipeline to incorporate features from different sources and views; and (d) how to adapt and generalize algorithms as sensor configurations vary across different scenarios. In this survey, we review the most recent works on BEV perception and provide an in-depth analysis of different solutions. Moreover, several systematic BEV designs from industry are described as well. Furthermore, we introduce a full suite of practical guidelines to improve the performance of BEV perception tasks, covering camera, LiDAR, and fusion inputs. Finally, we point out future research directions in this area. We hope this report will benefit the community and encourage further research effort on BEV perception. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE Transactions on Pattern Analysis and Machine Intelligence | - |
dc.subject | 3D detection and segmentation | - |
dc.subject | autonomous driving challenge | - |
dc.subject | birds-eye-view (BEV) perception | - |
dc.title | Delving into the Devils of Bird's-Eye-View Perception: A Review, Evaluation and Recipe | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TPAMI.2023.3333838 | - |
dc.identifier.pmid | 37976193 | - |
dc.identifier.scopus | eid_2-s2.0-85178071901 | - |
dc.identifier.volume | 46 | - |
dc.identifier.issue | 4 | - |
dc.identifier.spage | 2151 | - |
dc.identifier.epage | 2170 | - |
dc.identifier.eissn | 1939-3539 | - |