Article: Correspondence Distillation from NeRF-Based GAN

Title: Correspondence Distillation from NeRF-Based GAN
Authors: Lan, Yushi; Loy, Chen Change; Dai, Bo
Keywords: Computer graphics; Computer vision; Dense correspondence; Generative modeling; Neural radiance field; Shape analysis
Issue Date: 2024
Citation: International Journal of Computer Vision, 2024, v. 132, n. 3, p. 611-631
Abstract: The neural radiance field (NeRF) has shown promising results in preserving the fine details of objects and scenes. However, unlike explicit shape representations e.g., mesh, it remains an open problem to build dense correspondences across different NeRFs of the same category, which is essential in many downstream tasks. The main difficulties of this problem lie in the implicit nature of NeRF and the lack of ground-truth correspondence annotations. In this paper, we show it is possible to bypass these challenges by leveraging the rich semantics and structural priors encapsulated in a pre-trained NeRF-based GAN. Specifically, we exploit such priors from three aspects, namely (1) a dual deformation field that takes latent codes as global structural indicators, (2) a learning objective that regards generator features as geometric-aware local descriptors, and (3) a source of infinite object-specific NeRF samples. Our experiments demonstrate that such priors lead to 3D dense correspondence that is accurate, smooth, and robust. We also show that established dense correspondence across NeRFs can effectively enable many NeRF-based downstream applications such as texture transfer.
Persistent Identifier: http://hdl.handle.net/10722/352384
ISSN: 0920-5691
2023 Impact Factor: 11.6
2023 SCImago Journal Rankings: 6.668
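
The abstract above describes a dual deformation field that, conditioned on GAN latent codes, relates each generated NeRF instance to a shared template. As a rough illustration only, the following PyTorch sketch shows how such a two-way (instance-to-template and template-to-instance) deformation could be composed to query dense correspondences between two instances; the module names, layer sizes, and composition step are assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch of a latent-conditioned dual deformation field.
# Not the paper's architecture; shapes and composition are illustrative assumptions.
import torch
import torch.nn as nn


class DeformationMLP(nn.Module):
    """Predicts a 3D offset for each input point, conditioned on a latent code."""

    def __init__(self, latent_dim: int = 128, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, x: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        # x: (N, 3) query points; z: (latent_dim,) latent code shared by all points.
        z_rep = z.expand(x.shape[0], -1)
        return x + self.net(torch.cat([x, z_rep], dim=-1))


class DualDeformationField(nn.Module):
    """Two fields: instance space -> template space, and template space -> instance space."""

    def __init__(self, latent_dim: int = 128):
        super().__init__()
        self.to_template = DeformationMLP(latent_dim)
        self.from_template = DeformationMLP(latent_dim)

    def correspond(self, x_a: torch.Tensor, z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:
        # Warp points of instance A into the shared template space, then warp
        # them out into instance B's space to obtain corresponding locations.
        return self.from_template(self.to_template(x_a, z_a), z_b)


if __name__ == "__main__":
    field = DualDeformationField(latent_dim=128)
    pts_a = torch.rand(1024, 3)                    # sampled points on instance A
    z_a, z_b = torch.randn(128), torch.randn(128)  # latent codes of instances A and B
    pts_b = field.correspond(pts_a, z_a, z_b)      # corresponding points in instance B
    print(pts_b.shape)                             # torch.Size([1024, 3])
```

Note that, per the abstract, the paper additionally distills geometric cues by treating generator features as geometry-aware local descriptors and trains on NeRF samples drawn from the GAN; this sketch does not model those components.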

 

DC Field | Value | Language
dc.contributor.author | Lan, Yushi | -
dc.contributor.author | Loy, Chen Change | -
dc.contributor.author | Dai, Bo | -
dc.date.accessioned | 2024-12-16T03:58:36Z | -
dc.date.available | 2024-12-16T03:58:36Z | -
dc.date.issued | 2024 | -
dc.identifier.citation | International Journal of Computer Vision, 2024, v. 132, n. 3, p. 611-631 | -
dc.identifier.issn | 0920-5691 | -
dc.identifier.uri | http://hdl.handle.net/10722/352384 | -
dc.description.abstract | The neural radiance field (NeRF) has shown promising results in preserving the fine details of objects and scenes. However, unlike explicit shape representations e.g., mesh, it remains an open problem to build dense correspondences across different NeRFs of the same category, which is essential in many downstream tasks. The main difficulties of this problem lie in the implicit nature of NeRF and the lack of ground-truth correspondence annotations. In this paper, we show it is possible to bypass these challenges by leveraging the rich semantics and structural priors encapsulated in a pre-trained NeRF-based GAN. Specifically, we exploit such priors from three aspects, namely (1) a dual deformation field that takes latent codes as global structural indicators, (2) a learning objective that regards generator features as geometric-aware local descriptors, and (3) a source of infinite object-specific NeRF samples. Our experiments demonstrate that such priors lead to 3D dense correspondence that is accurate, smooth, and robust. We also show that established dense correspondence across NeRFs can effectively enable many NeRF-based downstream applications such as texture transfer. | -
dc.language | eng | -
dc.relation.ispartof | International Journal of Computer Vision | -
dc.subject | Computer graphics | -
dc.subject | Computer vision | -
dc.subject | Dense correspondence | -
dc.subject | Generative modeling | -
dc.subject | Neural radiance field | -
dc.subject | Shape analysis | -
dc.title | Correspondence Distillation from NeRF-Based GAN | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1007/s11263-023-01903-w | -
dc.identifier.scopus | eid_2-s2.0-85172194064 | -
dc.identifier.volume | 132 | -
dc.identifier.issue | 3 | -
dc.identifier.spage | 611 | -
dc.identifier.epage | 631 | -
dc.identifier.eissn | 1573-1405 | -
