Fulltext links (may require subscription): Scopus eid_2-s2.0-85090173593 · WOS:000534424302070
Conference Paper: Quaternion knowledge graph embeddings
Title | Quaternion knowledge graph embeddings |
---|---|
Authors | Zhang, Shuai; Tay, Yi; Yao, Lina; Liu, Qi |
Issue Date | 2019 |
Citation | 33rd Annual Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, 8-14 December 2019. In Advances in Neural Information Processing Systems, 2019, v. 32 |
Abstract | In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are utilized to represent entities. Relations are modelled as rotations in the quaternion space. The advantages of the proposed approach are: (1) Latent inter-dependencies (between all components) are aptly captured with the Hamilton product, encouraging a more compact interaction between entities and relations; (2) Quaternions enable expressive rotation in four-dimensional space and have more degrees of freedom than rotation in the complex plane; (3) The proposed framework is a generalization of ComplEx in hypercomplex space while offering better geometrical interpretations, concurrently satisfying the key desiderata of relational representation learning (i.e., modeling symmetry, anti-symmetry and inversion). Experimental results demonstrate that our method achieves state-of-the-art performance on four well-established knowledge graph completion benchmarks. |
Persistent Identifier | http://hdl.handle.net/10722/321897 |
ISSN | 1049-5258 (2020 SCImago Journal Rankings: 1.399) |
ISI Accession Number ID | WOS:000534424302070 |
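The core operation the abstract describes — modelling a relation as a rotation applied to a head entity via the Hamilton product, then comparing with the tail entity — can be sketched in a few lines. This is an illustrative simplification (pure Python, single quaternions rather than batched embedding vectors; the function names are ours, not from the paper's code):

```python
import math

def hamilton_product(p, q):
    """Hamilton product of quaternions given as (a, b, c, d) = a + bi + cj + dk."""
    a1, b1, c1, d1 = p
    a2, b2, c2, d2 = q
    return (
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,  # real part
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,  # i component
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,  # j component
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,  # k component
    )

def normalize(q):
    """Scale a quaternion to unit norm so it represents a pure rotation."""
    n = math.sqrt(sum(x * x for x in q))
    return tuple(x / n for x in q)

def score(head, relation, tail):
    """QuatE-style plausibility score: rotate the head embedding by the
    normalized relation quaternion, then take the inner product with the tail."""
    rotated = hamilton_product(head, normalize(relation))
    return sum(h * t for h, t in zip(rotated, tail))
```

The non-commutativity of the Hamilton product (i ⊗ j = k but j ⊗ i = −k) is what allows the model to distinguish (h, r, t) from (t, r, h) and thus capture anti-symmetric relations.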
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhang, Shuai | - |
dc.contributor.author | Tay, Yi | - |
dc.contributor.author | Yao, Lina | - |
dc.contributor.author | Liu, Qi | - |
dc.date.accessioned | 2022-11-03T02:22:12Z | - |
dc.date.available | 2022-11-03T02:22:12Z | - |
dc.date.issued | 2019 | - |
dc.identifier.citation | 33rd Annual Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, 8-14 December 2019. In Advances in Neural Information Processing Systems, 2019, v. 32 | - |
dc.identifier.issn | 1049-5258 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321897 | - |
dc.description.abstract | In this work, we move beyond the traditional complex-valued representations, introducing more expressive hypercomplex representations to model entities and relations for knowledge graph embeddings. More specifically, quaternion embeddings, hypercomplex-valued embeddings with three imaginary components, are utilized to represent entities. Relations are modelled as rotations in the quaternion space. The advantages of the proposed approach are: (1) Latent inter-dependencies (between all components) are aptly captured with the Hamilton product, encouraging a more compact interaction between entities and relations; (2) Quaternions enable expressive rotation in four-dimensional space and have more degrees of freedom than rotation in the complex plane; (3) The proposed framework is a generalization of ComplEx in hypercomplex space while offering better geometrical interpretations, concurrently satisfying the key desiderata of relational representation learning (i.e., modeling symmetry, anti-symmetry and inversion). Experimental results demonstrate that our method achieves state-of-the-art performance on four well-established knowledge graph completion benchmarks. | - |
dc.language | eng | - |
dc.relation.ispartof | Advances in Neural Information Processing Systems | - |
dc.title | Quaternion knowledge graph embeddings | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_OA_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85090173593 | - |
dc.identifier.volume | 32 | - |
dc.identifier.isi | WOS:000534424302070 | - |