File Download

Links for fulltext (may require subscription):
- Publisher Website (DOI): 10.24963/ijcai.2019/426
- Scopus: eid_2-s2.0-85074948995
Citations:
- Scopus: 0
Appears in Collections: Conference Paper

Parametric Manifold Learning of Gaussian Mixture Models
| Field | Value |
|---|---|
| Title | Parametric Manifold Learning of Gaussian Mixture Models |
| Authors | Liu, Z; Yu, L; Hsiao, JHW; Chan, AB |
| Issue Date | 2019 |
| Publisher | International Joint Conference on Artificial Intelligence. The Proceedings' web site is located at https://www.ijcai.org/past_proceedings |
| Citation | Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI-19), Macau, China, 10-16 August 2019, p. 3073-3079 |
| Abstract | The Gaussian Mixture Model (GMM) is among the most widely used parametric probability distributions for representing data. However, it is complicated to analyze the relationship among GMMs since they lie on a high-dimensional manifold. Previous works either perform clustering of GMMs, which learns a limited discrete latent representation, or kernel-based embedding of GMMs, which is not interpretable due to difficulty in computing the inverse mapping. In this paper, we propose Parametric Manifold Learning of GMMs (PML-GMM), which learns a parametric mapping from a low-dimensional latent space to a high-dimensional GMM manifold. Similar to PCA, the proposed mapping is parameterized by the principal axes for the component weights, means, and covariances, which are optimized to minimize the reconstruction loss measured using Kullback-Leibler divergence (KLD). As the KLD between two GMMs is intractable, we approximate the objective function by a variational upper bound, which is optimized by an EM-style algorithm. Moreover, we derive an efficient solver by alternating optimization of subproblems and exploit Monte Carlo sampling to escape from local minima. We demonstrate the effectiveness of PML-GMM through experiments on synthetic, eye-fixation, flow cytometry, and social check-in data. |
| Description | Main track |
| Persistent Identifier | http://hdl.handle.net/10722/274709 |
| ISSN | 1045-0823 (2020 SCImago Journal Rankings: 0.649) |
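The abstract notes that the KLD between two GMMs has no closed form, which is why the paper resorts to a variational upper bound and Monte Carlo sampling. As background context only (this is not the paper's PML-GMM algorithm), a minimal sketch of a Monte Carlo estimator of KL(f‖g) between two GMMs; the helper names `gmm_logpdf`, `gmm_sample`, and `mc_kld` are hypothetical:

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

# A GMM is represented here as a tuple (weights, means, covs).

def gmm_logpdf(x, weights, means, covs):
    # Log-density log sum_k w_k N(x; mu_k, Sigma_k) at points x of
    # shape (n, d), computed stably via logsumexp over components.
    comp = np.stack([multivariate_normal.logpdf(x, mean=m, cov=c)
                     for m, c in zip(means, covs)], axis=0)      # (K, n)
    return logsumexp(comp + np.log(weights)[:, None], axis=0)    # (n,)

def gmm_sample(n, weights, means, covs, rng):
    # Ancestral sampling: choose a component, then draw from its Gaussian.
    ks = rng.choice(len(weights), size=n, p=weights)
    return np.stack([rng.multivariate_normal(means[k], covs[k]) for k in ks])

def mc_kld(f, g, n=20000, seed=0):
    # KL(f || g) ~= (1/n) sum_i [log f(x_i) - log g(x_i)],  x_i ~ f.
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *f, rng)
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))
```

The estimator is unbiased but noisy, which is one reason deterministic variational bounds (as used in the paper) are attractive for optimization.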
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, Z | - |
dc.contributor.author | Yu, L | - |
dc.contributor.author | Hsiao, JHW | - |
dc.contributor.author | Chan, AB | - |
dc.date.accessioned | 2019-09-10T02:27:05Z | - |
dc.date.available | 2019-09-10T02:27:05Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI-19), Macau, China, 10-16 August 2019, p. 3073-3079 | - |
dc.identifier.issn | 1045-0823 | - |
dc.identifier.uri | http://hdl.handle.net/10722/274709 | - |
dc.description | Main track | - |
dc.description.abstract | The Gaussian Mixture Model (GMM) is among the most widely used parametric probability distributions for representing data. However, it is complicated to analyze the relationship among GMMs since they lie on a high-dimensional manifold. Previous works either perform clustering of GMMs, which learns a limited discrete latent representation, or kernel-based embedding of GMMs, which is not interpretable due to difficulty in computing the inverse mapping. In this paper, we propose Parametric Manifold Learning of GMMs (PML-GMM), which learns a parametric mapping from a low-dimensional latent space to a high-dimensional GMM manifold. Similar to PCA, the proposed mapping is parameterized by the principal axes for the component weights, means, and covariances, which are optimized to minimize the reconstruction loss measured using Kullback-Leibler divergence (KLD). As the KLD between two GMMs is intractable, we approximate the objective function by a variational upper bound, which is optimized by an EM-style algorithm. Moreover, we derive an efficient solver by alternating optimization of subproblems and exploit Monte Carlo sampling to escape from local minima. We demonstrate the effectiveness of PML-GMM through experiments on synthetic, eye-fixation, flow cytometry, and social check-in data. | - |
dc.language | eng | - |
dc.publisher | International Joint Conference on Artificial Intelligence. The Proceedings' web site is located at https://www.ijcai.org/past_proceedings | - |
dc.relation.ispartof | Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI-19) | - |
dc.title | Parametric Manifold Learning of Gaussian Mixture Models | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Hsiao, JHW: jhsiao@hku.hk | - |
dc.identifier.authority | Hsiao, JHW=rp00632 | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.doi | 10.24963/ijcai.2019/426 | - |
dc.identifier.scopus | eid_2-s2.0-85074948995 | - |
dc.identifier.hkuros | 303462 | - |
dc.identifier.spage | 3073 | - |
dc.identifier.epage | 3079 | - |
dc.publisher.place | United States | - |
dc.identifier.issnl | 1045-0823 | - |