Links for fulltext (may require subscription):
- Publisher website (DOI): 10.1109/TPAMI.2020.3020800
- Scopus: eid_2-s2.0-85122780857
- PMID: 32870785
- Web of Science: WOS:000740006100031
Article: GeoNet++: Iterative Geometric Neural Network with Edge-Aware Refinement for Joint Depth and Surface Normal Estimation
| Field | Value |
|---|---|
| Title | GeoNet++: Iterative Geometric Neural Network with Edge-Aware Refinement for Joint Depth and Surface Normal Estimation |
| Authors | Qi, X; Liu, Z; Liao, R; Torr, P; Urtasun, R; Jia, J |
| Keywords | Three-dimensional displays; Surface reconstruction; Estimation; Image reconstruction; Computer architecture |
| Issue Date | 2020 |
| Publisher | IEEE. The journal's web site is located at http://www.computer.org/tpami |
| Citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, Epub 2020-09-01, p. 1-1 |
| Abstract | In this paper, we propose a geometric neural network with edge-aware refinement (GeoNet++) to jointly predict both depth and surface normal maps from a single image. Building on top of two-stream CNNs, GeoNet++ captures the geometric relationships between depth and surface normals with the proposed depth-to-normal and normal-to-depth modules. In particular, the “depth-to-normal” module exploits the least square solution of estimating surface normals from depth to improve their quality, while the “normal-to-depth” module refines the depth map based on the constraints on surface normals through kernel regression. Boundary information is exploited via an edge-aware refinement module. GeoNet++ effectively predicts depth and surface normals with high 3D consistency and sharp boundaries resulting in better reconstructed 3D scenes. Note that GeoNet++ is generic and can be used in other depth/normal prediction frameworks to improve 3D reconstruction quality and pixel-wise accuracy of depth and surface normals. Furthermore, we propose a new 3D geometric metric (3DGM) for evaluating depth prediction in 3D. In contrast to current metrics that focus on evaluating pixel-wise error/accuracy, 3DGM measures whether the predicted depth can reconstruct high-quality 3D surface normals. This is a more natural metric for many 3D application domains. Our experiments on NYUD-V2 and KITTI demonstrate the effectiveness of our approach. |
| Persistent Identifier | http://hdl.handle.net/10722/287658 |
| ISSN | 0162-8828 |
| 2023 Impact Factor | 20.8 |
| 2023 SCImago Journal Rankings | 6.158 |
| ISI Accession Number ID | WOS:000740006100031 |
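The abstract's "depth-to-normal" module builds on the classical idea of recovering surface normals from depth via least squares. As a minimal illustrative sketch (not the paper's learned, differentiable module): back-project each pixel to a 3D point with a pinhole camera model, then fit a plane to each pixel's neighborhood; the least-squares plane normal is the eigenvector of the neighborhood covariance with the smallest eigenvalue. The function name `normals_from_depth` and the intrinsics `fx, fy, cx, cy` are hypothetical placeholders.

```python
import numpy as np

def normals_from_depth(depth, fx, fy, cx, cy, k=3):
    """Estimate per-pixel surface normals from a depth map by least-squares
    plane fitting over a (2k+1) x (2k+1) neighborhood of back-projected
    3D points. This is a classical baseline, not GeoNet++'s learned module."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Back-project pixels to camera-space 3D points (pinhole model).
    X = (u - cx) * depth / fx
    Y = (v - cy) * depth / fy
    pts = np.stack([X, Y, depth], axis=-1)  # shape (h, w, 3)

    normals = np.zeros_like(pts)
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - k), min(h, i + k + 1)
            j0, j1 = max(0, j - k), min(w, j + k + 1)
            nb = pts[i0:i1, j0:j1].reshape(-1, 3)
            nb = nb - nb.mean(axis=0)
            # Least-squares plane fit: the normal is the right singular
            # vector with the smallest singular value.
            _, _, vt = np.linalg.svd(nb, full_matrices=False)
            n = vt[-1]
            if n[2] > 0:  # orient normals toward the camera (-z)
                n = -n
            normals[i, j] = n
    return normals
```

For a fronto-parallel plane (constant depth), every normal should come out as (0, 0, -1); this kind of sanity check is a cheap way to validate the back-projection and sign convention.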
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Qi, X | - |
dc.contributor.author | Liu, Z | - |
dc.contributor.author | Liao, R | - |
dc.contributor.author | Torr, P | - |
dc.contributor.author | Urtasun, R | - |
dc.contributor.author | Jia, J | - |
dc.date.accessioned | 2020-10-05T12:01:20Z | - |
dc.date.available | 2020-10-05T12:01:20Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020, Epub 2020-09-01, p. 1-1 | - |
dc.identifier.issn | 0162-8828 | - |
dc.identifier.uri | http://hdl.handle.net/10722/287658 | - |
dc.description.abstract | In this paper, we propose a geometric neural network with edge-aware refinement (GeoNet++) to jointly predict both depth and surface normal maps from a single image. Building on top of two-stream CNNs, GeoNet++ captures the geometric relationships between depth and surface normals with the proposed depth-to-normal and normal-to-depth modules. In particular, the “depth-to-normal” module exploits the least square solution of estimating surface normals from depth to improve their quality, while the “normal-to-depth” module refines the depth map based on the constraints on surface normals through kernel regression. Boundary information is exploited via an edge-aware refinement module. GeoNet++ effectively predicts depth and surface normals with high 3D consistency and sharp boundaries resulting in better reconstructed 3D scenes. Note that GeoNet++ is generic and can be used in other depth/normal prediction frameworks to improve 3D reconstruction quality and pixel-wise accuracy of depth and surface normals. Furthermore, we propose a new 3D geometric metric (3DGM) for evaluating depth prediction in 3D. In contrast to current metrics that focus on evaluating pixel-wise error/accuracy, 3DGM measures whether the predicted depth can reconstruct high-quality 3D surface normals. This is a more natural metric for many 3D application domains. Our experiments on NYUD-V2 and KITTI demonstrate the effectiveness of our approach. | - |
dc.language | eng | - |
dc.publisher | IEEE. The Journal's web site is located at http://www.computer.org/tpami | - |
dc.relation.ispartof | IEEE Transactions on Pattern Analysis and Machine Intelligence | - |
dc.rights | IEEE Transactions on Pattern Analysis and Machine Intelligence. Copyright © IEEE. | - |
dc.rights | ©20xx IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | Three-dimensional displays | - |
dc.subject | Surface reconstruction | - |
dc.subject | Estimation | - |
dc.subject | Image reconstruction | - |
dc.subject | Computer architecture | - |
dc.title | GeoNet++: Iterative Geometric Neural Network with Edge-Aware Refinement for Joint Depth and Surface Normal Estimation | - |
dc.type | Article | - |
dc.identifier.email | Qi, X: xjqi@eee.hku.hk | - |
dc.identifier.authority | Qi, X=rp02666 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TPAMI.2020.3020800 | - |
dc.identifier.pmid | 32870785 | - |
dc.identifier.scopus | eid_2-s2.0-85122780857 | - |
dc.identifier.hkuros | 315446 | - |
dc.identifier.volume | Epub 2020-09-01 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 1 | - |
dc.identifier.isi | WOS:000740006100031 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 0162-8828 | - |