- Appears in Collections:
Conference Paper: Efficient learning for undirected topic models
Title | Efficient learning for undirected topic models |
---|---|
Authors | Gu, J; Li, VOK |
Issue Date | 2015 |
Publisher | Association for Computational Linguistics (ACL). |
Citation | The 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing (ACL-IJCNLP 2015), Beijing, China, 26-31 July 2015. In Conference Proceedings, 2015, v. 2, p. 162-167 |
Abstract | Replicated Softmax model, a well-known undirected topic model, is powerful in extracting semantic representations of documents. Traditional learning strategies such as Contrastive Divergence are very inefficient. This paper provides a novel estimator to speed up the learning based on Noise Contrastive Estimate, extended for documents of variant lengths and weighted inputs. Experiments on two benchmarks show that the new estimator achieves great learning efficiency and high accuracy on document retrieval and classification. © 2015 Association for Computational Linguistics. |
Persistent Identifier | http://hdl.handle.net/10722/232307 |
ISBN | 978-194164373-0 |
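The abstract above describes an estimator built on noise-contrastive estimation (NCE), which replaces maximum-likelihood training of an unnormalized model with a binary classification task: distinguish data samples from samples drawn from a known noise distribution. The sketch below is a minimal, generic illustration of the standard NCE objective only — it is not the paper's extension to variable-length or weighted documents, and all function and variable names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def nce_loss(log_p_model_data, log_p_noise_data,
             log_p_model_noise, log_p_noise_noise, k):
    """Generic binary NCE loss (illustrative sketch, not the paper's estimator).

    Classifies each data point against k noise samples. Inputs are arrays of
    log-probabilities under the (unnormalized) model and the noise distribution,
    evaluated at data samples and at noise samples respectively.
    """
    # Log-odds that a sample came from the model rather than from noise.
    s_data = log_p_model_data - (np.log(k) + log_p_noise_data)
    s_noise = log_p_model_noise - (np.log(k) + log_p_noise_noise)
    # log sigmoid(s) = -log(1 + exp(-s)), computed stably via logaddexp.
    log_sig_data = -np.logaddexp(0.0, -s_data)    # log P(label=data | data)
    log_sig_noise = -np.logaddexp(0.0, s_noise)   # log P(label=noise | noise)
    # Negative expected log-likelihood of the binary classifier.
    return -(np.mean(log_sig_data) + k * np.mean(log_sig_noise))
```

A model that assigns high probability to data and low probability to noise yields a lower loss, which is what drives learning; because the objective never requires the model's partition function, it avoids the expensive sampling used by Contrastive Divergence.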
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gu, J | - |
dc.contributor.author | Li, VOK | - |
dc.date.accessioned | 2016-09-20T05:29:06Z | - |
dc.date.available | 2016-09-20T05:29:06Z | - |
dc.date.issued | 2015 | - |
dc.identifier.citation | The 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing (ACL-IJCNLP 2015), Beijing, China, 26-31 July 2015. In Conference Proceedings, 2015, v. 2, p. 162-167 | - |
dc.identifier.isbn | 978-194164373-0 | - |
dc.identifier.uri | http://hdl.handle.net/10722/232307 | - |
dc.description.abstract | Replicated Softmax model, a well-known undirected topic model, is powerful in extracting semantic representations of documents. Traditional learning strategies such as Contrastive Divergence are very inefficient. This paper provides a novel estimator to speed up the learning based on Noise Contrastive Estimate, extended for documents of variant lengths and weighted inputs. Experiments on two benchmarks show that the new estimator achieves great learning efficiency and high accuracy on document retrieval and classification. © 2015 Association for Computational Linguistics. | -
dc.language | eng | - |
dc.publisher | Association for Computational Linguistics (ACL). | - |
dc.relation.ispartof | Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Short Papers) | - |
dc.title | Efficient learning for undirected topic models | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Li, VOK: vli@eee.hku.hk | - |
dc.identifier.authority | Li, VOK=rp00150 | - |
dc.description.nature | postprint | - |
dc.identifier.scopus | eid_2-s2.0-84944064715 | - |
dc.identifier.hkuros | 265297 | - |
dc.identifier.volume | 2 | - |
dc.identifier.spage | 162 | - |
dc.identifier.epage | 167 | - |
dc.publisher.place | United States | - |
dc.customcontrol.immutable | sml 161117 | - |