Links for fulltext (May Require Subscription)
- Publisher Website: 10.18653/v1/p18-1030
- Scopus: eid_2-s2.0-85063075441
- WOS: WOS:000493904300030
Conference Paper: Sentence-state LSTM for text representation
Field | Value
---|---
Title | Sentence-state LSTM for text representation
Authors | Zhang, Yue; Liu, Qi; Song, Linfeng
Issue Date | 2018
Citation | ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers), 2018, v. 1, p. 317-327
Abstract | Bi-directional LSTMs are a powerful tool for text representation. On the other hand, they have been shown to suffer various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incremental reading of a sequence of words. Results on various classification and sequence labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performances compared to stacked BiLSTM models with similar parameter numbers.
Persistent Identifier | http://hdl.handle.net/10722/321838
ISI Accession Number ID | WOS:000493904300030
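The abstract above describes an architecture in which every word keeps its own parallel hidden state while a global sentence state exchanges information with all word states at each recurrent step. The sketch below illustrates that idea in PyTorch under simplifying assumptions: the gating scheme, the class and layer names (`SimplifiedSLSTM`, `word_gate`, `glob_gate`, etc.), and the default step count are illustrative choices, not the paper's exact equations.

```python
# Simplified sketch of the sentence-state LSTM idea: one parallel state
# per word plus a global sentence state, updated jointly at each step.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimplifiedSLSTM(nn.Module):
    def __init__(self, dim: int, steps: int = 3):
        super().__init__()
        self.steps = steps
        # Word update: combine left neighbour, self, right neighbour,
        # the word's input embedding, and the global sentence state.
        self.word_gate = nn.Linear(5 * dim, dim)
        self.word_cand = nn.Linear(5 * dim, dim)
        # Global update: combine the old global state with the mean
        # of all word states.
        self.glob_gate = nn.Linear(2 * dim, dim)
        self.glob_cand = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, dim) word embeddings.
        h = torch.zeros_like(x)          # one parallel state per word
        g = x.mean(dim=1)                # global sentence state
        for _ in range(self.steps):
            # Zero-padded neighbours: h_{i-1} and h_{i+1} for every i.
            left = F.pad(h, (0, 0, 1, 0))[:, :-1, :]
            right = F.pad(h, (0, 0, 0, 1))[:, 1:, :]
            g_exp = g.unsqueeze(1).expand_as(h)
            ctx = torch.cat([left, h, right, x, g_exp], dim=-1)
            gate = torch.sigmoid(self.word_gate(ctx))
            cand = torch.tanh(self.word_cand(ctx))
            h = gate * cand + (1.0 - gate) * h   # all words update at once
            pooled = torch.cat([g, h.mean(dim=1)], dim=-1)
            g_gate = torch.sigmoid(self.glob_gate(pooled))
            g = g_gate * torch.tanh(self.glob_cand(pooled)) + (1.0 - g_gate) * g
        # Word-level states for sequence labelling, global state for
        # sentence classification.
        return h, g


# Usage: encode a batch of 2 sentences of length 7 with 64-dim embeddings.
enc = SimplifiedSLSTM(dim=64, steps=3)
words, sentence = enc(torch.randn(2, 7, 64))
print(words.shape, sentence.shape)  # torch.Size([2, 7, 64]) torch.Size([2, 64])
```

Unlike a BiLSTM, which needs seq_len sequential steps to propagate information end to end, every word here is updated in parallel, and the number of recurrent steps is a fixed hyperparameter independent of sentence length.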
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhang, Yue | - |
dc.contributor.author | Liu, Qi | - |
dc.contributor.author | Song, Linfeng | - |
dc.date.accessioned | 2022-11-03T02:21:47Z | - |
dc.date.available | 2022-11-03T02:21:47Z | - |
dc.date.issued | 2018 | - |
dc.identifier.citation | ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers), 2018, v. 1, p. 317-327 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321838 | - |
dc.description.abstract | Bi-directional LSTMs are a powerful tool for text representation. On the other hand, they have been shown to suffer various limitations due to their sequential nature. We investigate an alternative LSTM structure for encoding text, which consists of a parallel state for each word. Recurrent steps are used to perform local and global information exchange between words simultaneously, rather than incremental reading of a sequence of words. Results on various classification and sequence labelling benchmarks show that the proposed model has strong representation power, giving highly competitive performances compared to stacked BiLSTM models with similar parameter numbers. | - |
dc.language | eng | - |
dc.relation.ispartof | ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) | - |
dc.title | Sentence-state LSTM for text representation | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.18653/v1/p18-1030 | - |
dc.identifier.scopus | eid_2-s2.0-85063075441 | - |
dc.identifier.volume | 1 | - |
dc.identifier.spage | 317 | - |
dc.identifier.epage | 327 | - |
dc.identifier.isi | WOS:000493904300030 | - |