Conference Paper: Search engine guided neural machine translation
Title | Search engine guided neural machine translation |
---|---|
Authors | Gu, J; Wang, Y; Cho, K; Li, VOK |
Keywords | Machine Translation; Search Engine; Non-Parametric; Translation Memory |
Issue Date | 2018 |
Publisher | Association for the Advancement of Artificial Intelligence (AAAI) Press. |
Citation | Proceedings of the 32nd Association for the Advancement of Artificial Intelligence (AAAI) Conference on Artificial Intelligence, New Orleans, Louisiana, USA, 2-7 February 2018, p. 5133-5140 How to Cite? |
Abstract | In this paper, we extend an attention-based neural machine translation (NMT) model by allowing it to access the entire training set of parallel sentence pairs even after training. The proposed approach consists of two stages. In the first stage, the retrieval stage, an off-the-shelf, black-box search engine is used to retrieve a small subset of sentence pairs from the training set given a source sentence. These pairs are further filtered by a fuzzy matching score based on edit distance. In the second stage, the translation stage, a novel translation model, called search engine guided NMT (SEG-NMT), seamlessly uses both the source sentence and the set of retrieved sentence pairs to perform the translation. Empirical evaluation on three language pairs (En-Fr, En-De, and En-Es) shows that the proposed approach significantly outperforms the baseline, and the improvement grows as more relevant sentence pairs are retrieved. |
Description | Session: AAAI18 - NLP and Machine Learning |
Persistent Identifier | http://hdl.handle.net/10722/262421 |
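The retrieval-stage filter described in the abstract, scoring retrieved pairs by a fuzzy matching score based on edit distance, can be sketched as below. This is an illustrative sketch, not the paper's implementation: the function names (`fuzzy_match_score`, `filter_pairs`), the token-level Levenshtein distance, the normalization `1 - distance / max(len)`, and the threshold value are all assumptions for exposition.

```python
def edit_distance(a, b):
    # Classic Levenshtein distance via dynamic programming,
    # keeping only the previous row to save memory.
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n]

def fuzzy_match_score(src, retrieved):
    # Score in [0, 1]; 1.0 means identical token sequences.
    a, b = src.split(), retrieved.split()
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

def filter_pairs(src, candidates, threshold=0.5):
    # Keep only retrieved (source, target) pairs whose source side
    # is sufficiently similar to the query sentence.
    return [(s, t) for s, t in candidates
            if fuzzy_match_score(src, s) >= threshold]
```

For example, given the query "the cat sat", a retrieved pair whose source is "the cat sat" scores 1.0 and survives the filter, while an unrelated sentence scores near 0 and is dropped before the translation stage sees it.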
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gu, J | - |
dc.contributor.author | Wang, Y | - |
dc.contributor.author | Cho, K | - |
dc.contributor.author | Li, VOK | - |
dc.date.accessioned | 2018-09-28T04:59:03Z | - |
dc.date.available | 2018-09-28T04:59:03Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | Proceedings of the 32nd Association for the Advancement of Artificial Intelligence (AAAI) Conference on Artificial Intelligence, New Orleans, Louisiana, USA, 2-7 February 2018, p. 5133-5140 | - |
dc.identifier.uri | http://hdl.handle.net/10722/262421 | - |
dc.description | Session: AAAI18 - NLP and Machine Learning | - |
dc.description.abstract | In this paper, we extend an attention-based neural machine translation (NMT) model by allowing it to access the entire training set of parallel sentence pairs even after training. The proposed approach consists of two stages. In the first stage, the retrieval stage, an off-the-shelf, black-box search engine is used to retrieve a small subset of sentence pairs from the training set given a source sentence. These pairs are further filtered by a fuzzy matching score based on edit distance. In the second stage, the translation stage, a novel translation model, called search engine guided NMT (SEG-NMT), seamlessly uses both the source sentence and the set of retrieved sentence pairs to perform the translation. Empirical evaluation on three language pairs (En-Fr, En-De, and En-Es) shows that the proposed approach significantly outperforms the baseline, and the improvement grows as more relevant sentence pairs are retrieved. | - |
dc.language | eng | - |
dc.publisher | Association for the Advancement of Artificial Intelligence (AAAI) Press. | - |
dc.relation.ispartof | Proceedings of the 32nd Association for the Advancement of Artificial Intelligence (AAAI) Conference on Artificial Intelligence (AAAI-18) | - |
dc.subject | Machine Translation;Search Engine | - |
dc.subject | Non-Parametric | - |
dc.subject | Translation Memory | - |
dc.title | Search engine guided neural machine translation | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Li, VOK: vli@eee.hku.hk | - |
dc.identifier.authority | Li, VOK=rp00150 | - |
dc.identifier.hkuros | 292176 | - |
dc.identifier.spage | 5133 | - |
dc.identifier.epage | 5140 | - |
dc.publisher.place | United States | - |