Conference Paper: Deep Generative Learning via Schrödinger Bridge
| Title | Deep Generative Learning via Schrödinger Bridge |
|---|---|
| Authors | Wang, Gefei; Jiao, Yuling; Xu, Qian; Wang, Yang; Yang, Can |
| Issue Date | 2021 |
| Citation | Proceedings of Machine Learning Research, 2021, v. 139, p. 10794-10804 |
| Abstract | We propose to learn a generative model via entropy interpolation with a Schrödinger Bridge. The generative learning task can be formulated as interpolating between a reference distribution and a target distribution based on the Kullback-Leibler divergence. At the population level, this entropy interpolation is characterized via an SDE on [0, 1] with a time-varying drift term. At the sample level, we derive our Schrödinger Bridge algorithm by plugging the drift term estimated by a deep score estimator and a deep density ratio estimator into the Euler-Maruyama method. Under some mild smoothness assumptions of the target distribution, we prove the consistency of both the score estimator and the density ratio estimator, and then establish the consistency of the proposed Schrödinger Bridge approach. Our theoretical results guarantee that the distribution learned by our approach converges to the target distribution. Experimental results on multimodal synthetic data and benchmark data support our theoretical findings and indicate that the generative model via Schrödinger Bridge is comparable with state-of-the-art GANs, suggesting a new formulation of generative learning. We demonstrate its usefulness in image interpolation and image inpainting. |
| Persistent Identifier | http://hdl.handle.net/10722/363546 |
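The abstract states that samples are generated by plugging an estimated time-varying drift (built from a deep score estimator and a deep density ratio estimator) into the Euler-Maruyama method on [0, 1]. Below is a minimal sketch of that discretization step only; the `drift` callable, `sigma`, and the toy example are placeholders and do not reproduce the paper's actual drift construction.

```python
import numpy as np

def euler_maruyama(x0, drift, n_steps=100, sigma=1.0, rng=None):
    """Simulate dX_t = drift(X_t, t) dt + sigma dW_t on t in [0, 1]
    using the Euler-Maruyama scheme.

    drift: callable (x, t) -> array with the same shape as x.
    In the paper the time-varying drift is estimated with a deep score
    estimator and a deep density-ratio estimator; here it is an
    arbitrary user-supplied placeholder.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    dt = 1.0 / n_steps
    for k in range(n_steps):
        t = k * dt
        noise = rng.standard_normal(x.shape)
        # One Euler-Maruyama update: deterministic drift step plus
        # Gaussian increment scaled by sqrt(dt).
        x = x + drift(x, t) * dt + sigma * np.sqrt(dt) * noise
    return x

if __name__ == "__main__":
    # Toy illustration only (not the drift from the paper): pull
    # samples from a standard Gaussian toward a fixed target point.
    target = np.array([2.0, -1.0])
    toy_drift = lambda x, t: (target - x) / max(1.0 - t, 1e-3)
    samples = euler_maruyama(np.zeros((512, 2)), toy_drift,
                             n_steps=200, sigma=0.5)
    print(samples.mean(axis=0))  # close to `target`
```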
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Wang, Gefei | - |
| dc.contributor.author | Jiao, Yuling | - |
| dc.contributor.author | Xu, Qian | - |
| dc.contributor.author | Wang, Yang | - |
| dc.contributor.author | Yang, Can | - |
| dc.date.accessioned | 2025-10-10T07:47:40Z | - |
| dc.date.available | 2025-10-10T07:47:40Z | - |
| dc.date.issued | 2021 | - |
| dc.identifier.citation | Proceedings of Machine Learning Research, 2021, v. 139, p. 10794-10804 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/363546 | - |
| dc.description.abstract | We propose to learn a generative model via entropy interpolation with a Schrödinger Bridge. The generative learning task can be formulated as interpolating between a reference distribution and a target distribution based on the Kullback-Leibler divergence. At the population level, this entropy interpolation is characterized via an SDE on [0, 1] with a time-varying drift term. At the sample level, we derive our Schrödinger Bridge algorithm by plugging the drift term estimated by a deep score estimator and a deep density ratio estimator into the Euler-Maruyama method. Under some mild smoothness assumptions of the target distribution, we prove the consistency of both the score estimator and the density ratio estimator, and then establish the consistency of the proposed Schrödinger Bridge approach. Our theoretical results guarantee that the distribution learned by our approach converges to the target distribution. Experimental results on multimodal synthetic data and benchmark data support our theoretical findings and indicate that the generative model via Schrödinger Bridge is comparable with state-of-the-art GANs, suggesting a new formulation of generative learning. We demonstrate its usefulness in image interpolation and image inpainting. | - |
| dc.language | eng | - |
| dc.relation.ispartof | Proceedings of Machine Learning Research | - |
| dc.title | Deep Generative Learning via Schrödinger Bridge | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.scopus | eid_2-s2.0-85161347918 | - |
| dc.identifier.volume | 139 | - |
| dc.identifier.spage | 10794 | - |
| dc.identifier.epage | 10804 | - |
| dc.identifier.eissn | 2640-3498 | - |
