Links for fulltext (may require subscription):
- Publisher Website: 10.18653/v1/P17-1176
- Scopus: eid_2-s2.0-85040945047
- WOS: WOS:000493984800176
Conference Paper: A teacher-student framework for zero-resource neural machine translation
| Title | A teacher-student framework for zero-resource neural machine translation |
|---|---|
| Authors | Chen, Y; Liu, Y; Cheng, Y; Li, VOK |
| Issue Date | 2017 |
| Publisher | Association for Computational Linguistics. The Proceedings' web site is located at http://aclweb.org/anthology/D/D17/#1000 |
| Citation | Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL), Vancouver, Canada, 30 July - 4 August 2017, v. 1: Long Papers, p. 1925-1935 |
| Abstract | While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model ("student") without parallel corpora available, guided by an existing pivot-to-target NMT model ("teacher") on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs. |
| Persistent Identifier | http://hdl.handle.net/10722/262432 |
| ISI Accession Number ID | WOS:000493984800176 |
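The abstract's core idea, training a source-to-target "student" model to imitate an existing pivot-to-target "teacher" model on a source-pivot parallel corpus, can be illustrated with a minimal sketch of a word-level distillation loss. The code below is a toy illustration in plain Python, not the authors' implementation; the function name and the tiny distributions are hypothetical. It computes the per-position cross-entropy between the teacher's and the student's next-word distributions, which is the kind of quantity a teacher-student framework drives down during training.

```python
import math

def word_level_kd_loss(teacher_dists, student_dists):
    """Average cross-entropy between teacher and student distributions.

    teacher_dists, student_dists: lists of per-position probability
    distributions over the target vocabulary (one list entry per
    target position). Minimizing this pushes the student's
    predictions toward the teacher's.
    """
    total = 0.0
    for t_dist, s_dist in zip(teacher_dists, student_dists):
        # Cross-entropy H(teacher, student) at this target position.
        total += -sum(t * math.log(s) for t, s in zip(t_dist, s_dist) if t > 0)
    return total / len(teacher_dists)

# Toy example over a 3-word vocabulary, single target position.
teacher = [[0.7, 0.2, 0.1]]
good_student = [[0.6, 0.3, 0.1]]   # close to the teacher
bad_student = [[0.1, 0.2, 0.7]]    # far from the teacher

# A student matching the teacher incurs a lower loss.
print(word_level_kd_loss(teacher, good_student))
print(word_level_kd_loss(teacher, bad_student))
```

The sentence-level variant described in such frameworks replaces the full distribution with the teacher's decoded translation, training the student on (source, teacher-output) pairs with ordinary cross-entropy.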
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Chen, Y | - |
| dc.contributor.author | Liu, Y | - |
| dc.contributor.author | Cheng, Y | - |
| dc.contributor.author | Li, VOK | - |
| dc.date.accessioned | 2018-09-28T04:59:14Z | - |
| dc.date.available | 2018-09-28T04:59:14Z | - |
| dc.date.issued | 2017 | - |
| dc.identifier.citation | Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL), Vancouver, Canada, 30 July - 4 August 2017, v. 1: Long Papers, p. 1925-1935 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/262432 | - |
| dc.description.abstract | While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model ("student") without parallel corpora available, guided by an existing pivot-to-target NMT model ("teacher") on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs. | - |
| dc.language | eng | - |
| dc.publisher | Association for Computational Linguistics. The Proceedings' web site is located at http://aclweb.org/anthology/D/D17/#1000 | - |
| dc.relation.ispartof | Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL) | - |
| dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
| dc.title | A teacher-student framework for zero-resource neural machine translation | - |
| dc.type | Conference_Paper | - |
| dc.identifier.email | Li, VOK: vli@eee.hku.hk | - |
| dc.identifier.authority | Li, VOK=rp00150 | - |
| dc.description.nature | published_or_final_version | - |
| dc.identifier.doi | 10.18653/v1/P17-1176 | - |
| dc.identifier.scopus | eid_2-s2.0-85040945047 | - |
| dc.identifier.hkuros | 292194 | - |
| dc.identifier.volume | 1: Long Papers | - |
| dc.identifier.spage | 1925 | - |
| dc.identifier.epage | 1935 | - |
| dc.identifier.isi | WOS:000493984800176 | - |