Conference Paper: Learning Neural Acoustic Fields

Title: Learning Neural Acoustic Fields
Authors: Luo, Andrew; Tarr, Michael J.; Torralba, Antonio; Du, Yilun; Tenenbaum, Joshua B.; Gan, Chuang
Issue Date: 2022
Citation: Advances in Neural Information Processing Systems, 2022, v. 35
Abstract: Our environment is filled with rich and dynamic acoustic information. When we walk into a cathedral, the reverberations as much as appearance inform us of the sanctuary's wide open space. Similarly, as an object moves around us, we expect the sound emitted to also exhibit this movement. While recent advances in learned implicit functions have led to increasingly higher quality representations of the visual world, there have not been commensurate advances in learning spatial auditory representations. To address this gap, we introduce Neural Acoustic Fields (NAFs), an implicit representation that captures how sounds propagate in a physical scene. By modeling acoustic propagation in a scene as a linear time-invariant system, NAFs learn to continuously map all emitter and listener location pairs to a neural impulse response function that can then be applied to arbitrary sounds. We demonstrate NAFs on both synthetic and real data, and show that the continuous nature of NAFs enables us to render spatial acoustics for a listener at arbitrary locations. We further show that the representation learned by NAFs can help improve visual learning with sparse views. Finally we show that a representation informative of scene structure emerges during the learning of NAFs. Project site: https://www.andrew.cmu.edu/user/afluo/Neural_Acoustic_Fields.
Persistent Identifier: http://hdl.handle.net/10722/352358
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399
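
The abstract above hinges on one modeling choice: because acoustic propagation is treated as a linear time-invariant system, rendering any sound at a listener position reduces to convolving the dry signal with the impulse response predicted for that emitter-listener pair. The sketch below is a minimal illustration of that idea, not the authors' implementation: the class name AcousticFieldMLP, the hidden sizes, the time-domain impulse-response length, and the plain coordinate input are all illustrative assumptions (the paper works with richer inputs and time-frequency representations).

```python
# Minimal sketch (not the authors' code): query an implicit acoustic field
# at an (emitter, listener) pair to get an impulse response, then convolve
# it with an arbitrary dry sound. Names and sizes are assumptions.
import numpy as np
import torch
import torch.nn as nn

IR_LENGTH = 4096  # assumed impulse-response length in samples


class AcousticFieldMLP(nn.Module):
    """Maps concatenated 3-D emitter and listener coordinates to an IR."""

    def __init__(self, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, IR_LENGTH),
        )

    def forward(self, emitter: torch.Tensor, listener: torch.Tensor) -> torch.Tensor:
        # Continuous map from a location pair to an impulse response.
        return self.net(torch.cat([emitter, listener], dim=-1))


model = AcousticFieldMLP()
emitter = torch.tensor([[1.0, 0.5, 1.2]])   # source position (x, y, z)
listener = torch.tensor([[3.0, 2.0, 1.5]])  # listener position (x, y, z)
with torch.no_grad():
    ir = model(emitter, listener).squeeze(0).numpy()

# LTI assumption: spatializing any sound is convolution with the IR.
dry_audio = np.random.randn(16000)      # stand-in for an arbitrary dry sound
rendered = np.convolve(dry_audio, ir)   # audio as heard at the listener
```

Because the field is continuous in the input coordinates, the same trained model can be queried at listener positions never seen during training, which is what enables the arbitrary-location rendering the abstract describes.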


DC Field: Value
dc.contributor.author: Luo, Andrew
dc.contributor.author: Tarr, Michael J.
dc.contributor.author: Torralba, Antonio
dc.contributor.author: Du, Yilun
dc.contributor.author: Tenenbaum, Joshua B.
dc.contributor.author: Gan, Chuang
dc.date.accessioned: 2024-12-16T03:58:27Z
dc.date.available: 2024-12-16T03:58:27Z
dc.date.issued: 2022
dc.identifier.citation: Advances in Neural Information Processing Systems, 2022, v. 35
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/352358
dc.description.abstract: Our environment is filled with rich and dynamic acoustic information. When we walk into a cathedral, the reverberations as much as appearance inform us of the sanctuary's wide open space. Similarly, as an object moves around us, we expect the sound emitted to also exhibit this movement. While recent advances in learned implicit functions have led to increasingly higher quality representations of the visual world, there have not been commensurate advances in learning spatial auditory representations. To address this gap, we introduce Neural Acoustic Fields (NAFs), an implicit representation that captures how sounds propagate in a physical scene. By modeling acoustic propagation in a scene as a linear time-invariant system, NAFs learn to continuously map all emitter and listener location pairs to a neural impulse response function that can then be applied to arbitrary sounds. We demonstrate NAFs on both synthetic and real data, and show that the continuous nature of NAFs enables us to render spatial acoustics for a listener at arbitrary locations. We further show that the representation learned by NAFs can help improve visual learning with sparse views. Finally we show that a representation informative of scene structure emerges during the learning of NAFs. Project site: https://www.andrew.cmu.edu/user/afluo/Neural_Acoustic_Fields.
dc.language: eng
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Learning Neural Acoustic Fields
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85160137010
dc.identifier.volume: 35
