Links for fulltext (may require subscription):
- Publisher Website: https://doi.org/10.1145/3340531.3411945
- Scopus: eid_2-s2.0-85095866400
- WOS: WOS:000749561301001
Conference Paper: Dynamic Representation Learning for Large-Scale Attributed Networks
Title | Dynamic Representation Learning for Large-Scale Attributed Networks |
---|---|
Authors | Liu, Zhijun; Huang, Chao; Yu, Yanwei; Song, Peng; Fan, Baode; Dong, Junyu |
Keywords | dynamic networks; large-scale attributed networks; network representation learning; sparse random projection |
Issue Date | 2020 |
Citation | International Conference on Information and Knowledge Management, Proceedings, 2020, p. 1005-1014 |
Abstract | Network embedding, which aims at learning low-dimensional representations of nodes in a network, has drawn much attention for various network mining tasks, ranging from link prediction to node classification. In addition to network topological information, there also exist rich attributes associated with the network structure, which exert large effects on network formation. Hence, many efforts have been devoted to tackling attributed network embedding tasks. However, these methods are limited by their assumption of static network data: they do not account for evolving network structure or changes in the associated attributes. Furthermore, scalability is a key factor when performing representation learning on large-scale networks with huge numbers of nodes and edges. In this work, we address these challenges by developing DRLAN, a Dynamic Representation Learning framework for large-scale Attributed Networks. The DRLAN model generalizes dynamic attributed network embedding from two perspectives: First, we develop an integrative learning framework with an offline batch embedding module that preserves both node and attribute proximities, and an online network embedding module that recursively updates the learned representation vectors. Second, we design a recursive pre-projection mechanism that efficiently models attribute correlations based on the associative property of matrix multiplication. Finally, we perform extensive experiments on three real-world network datasets to show the superiority of DRLAN over state-of-the-art network embedding techniques in terms of both effectiveness and efficiency. The source code is available at: https://github.com/ZhijunLiu95/DRLAN. |
Persistent Identifier | http://hdl.handle.net/10722/308831 |
ISI Accession Number ID | WOS:000749561301001 |
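The abstract's mention of sparse random projection and of exploiting associativity to pre-project attributes can be illustrated with a short sketch. This is a minimal, hypothetical example, not the DRLAN implementation from the linked repository; the helper `sparse_random_projection`, the toy matrices `A` and `X`, and all dimensions are assumptions made only for illustration.

```python
import numpy as np
from scipy import sparse

def sparse_random_projection(n_features, dim, density=None, seed=0):
    """Very sparse random projection matrix R of shape (n_features, dim).

    Nonzero entries are +/- sqrt(1 / (density * dim)), so pairwise distances
    are approximately preserved (Johnson-Lindenstrauss style guarantee).
    """
    if density is None:
        density = 1.0 / np.sqrt(n_features)  # common "very sparse" choice
    rng = np.random.default_rng(seed)
    R = sparse.random(
        n_features, dim, density=density, random_state=seed,
        data_rvs=lambda k: rng.choice([-1.0, 1.0], size=k),
    )
    return R.tocsr() * np.sqrt(1.0 / (density * dim))

# Hypothetical toy data: adjacency A (n x n) and node attributes X (n x f).
n, f, dim = 1000, 500, 64
A = sparse.random(n, n, density=0.01, format="csr", random_state=1)
X = sparse.random(n, f, density=0.05, format="csr", random_state=2)

R = sparse_random_projection(f, dim)

# Associativity trick: (A @ X) @ R == A @ (X @ R).
# Pre-projecting the attributes once (X @ R is only n x dim) means each
# later structural update multiplies by the small pre-projected matrix
# instead of the full attribute matrix X.
X_proj = X @ R          # computed once, reused across network snapshots
emb = A @ X_proj        # cheap to refresh when A changes incrementally
```

The point of the sketch is only the order of operations: projecting the attribute matrix once and reusing the result is what makes repeated (dynamic) updates cheap, which is the role the abstract ascribes to the recursive pre-projection mechanism.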
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, Zhijun | - |
dc.contributor.author | Huang, Chao | - |
dc.contributor.author | Yu, Yanwei | - |
dc.contributor.author | Song, Peng | - |
dc.contributor.author | Fan, Baode | - |
dc.contributor.author | Dong, Junyu | - |
dc.date.accessioned | 2021-12-08T07:50:13Z | - |
dc.date.available | 2021-12-08T07:50:13Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | International Conference on Information and Knowledge Management, Proceedings, 2020, p. 1005-1014 | - |
dc.identifier.uri | http://hdl.handle.net/10722/308831 | - |
dc.description.abstract | Network embedding, which aims at learning low-dimensional representations of nodes in a network, has drawn much attention for various network mining tasks, ranging from link prediction to node classification. In addition to network topological information, there also exist rich attributes associated with the network structure, which exert large effects on network formation. Hence, many efforts have been devoted to tackling attributed network embedding tasks. However, these methods are limited by their assumption of static network data: they do not account for evolving network structure or changes in the associated attributes. Furthermore, scalability is a key factor when performing representation learning on large-scale networks with huge numbers of nodes and edges. In this work, we address these challenges by developing DRLAN, a Dynamic Representation Learning framework for large-scale Attributed Networks. The DRLAN model generalizes dynamic attributed network embedding from two perspectives: First, we develop an integrative learning framework with an offline batch embedding module that preserves both node and attribute proximities, and an online network embedding module that recursively updates the learned representation vectors. Second, we design a recursive pre-projection mechanism that efficiently models attribute correlations based on the associative property of matrix multiplication. Finally, we perform extensive experiments on three real-world network datasets to show the superiority of DRLAN over state-of-the-art network embedding techniques in terms of both effectiveness and efficiency. The source code is available at: https://github.com/ZhijunLiu95/DRLAN. | -
dc.language | eng | - |
dc.relation.ispartof | International Conference on Information and Knowledge Management, Proceedings | - |
dc.subject | dynamic networks | - |
dc.subject | large-scale attributed networks | - |
dc.subject | network representation learning | - |
dc.subject | sparse random projection | - |
dc.title | Dynamic Representation Learning for Large-Scale Attributed Networks | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1145/3340531.3411945 | - |
dc.identifier.scopus | eid_2-s2.0-85095866400 | - |
dc.identifier.spage | 1005 | - |
dc.identifier.epage | 1014 | - |
dc.identifier.isi | WOS:000749561301001 | - |