Exploiting centrality information with graph convolutions for network representation learning
Author(s)
Chen, H
Yin, H
Chen, T
Nguyen, QVH
Peng, WC
Li, X
Year published
2019
Abstract
Network embedding has been proven effective for learning low-dimensional vector representations of network vertices and has recently received a tremendous amount of research attention. However, most existing network embedding methods focus merely on preserving first- and second-order proximities between nodes, while the important property of node centrality is neglected. Various centrality measures, such as Degree, Closeness, Betweenness, Eigenvector and PageRank centrality, have been designed to quantify the importance of individual nodes. In this paper, we focus on a novel and unsolved problem: learning low-dimensional continuous node representations that preserve not only the network structure but also the centrality information. We propose a generalizable model, GraphCSC, that exploits both linkage information and centrality information to learn low-dimensional vector representations for network vertices. The embeddings learned by GraphCSC preserve different kinds of centrality information of nodes. In addition, we propose GraphCSC-M, a more comprehensive model that preserves several types of centrality information simultaneously by learning multiple centrality-specific embeddings, together with a novel attentive multi-view learning approach that compresses the multiple embeddings of a node into a compact vector representation. Extensive experiments demonstrate that our model preserves different centrality information of nodes and achieves better performance on several benchmark tasks than recent state-of-the-art network embedding methods.
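The abstract names five classical centrality measures. As a point of reference only (the paper does not prescribe a library), the following minimal sketch computes all five on a toy graph with NetworkX; the graph and function choices are illustrative assumptions, not part of the paper's experimental setup.

```python
# Minimal sketch (not from the paper): the five centrality measures named
# in the abstract, computed on a toy graph with NetworkX.
import networkx as nx

# Toy undirected graph; the paper's experiments use real-world networks instead.
G = nx.karate_club_graph()

centralities = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
    "pagerank": nx.pagerank(G),
}

# Each measure ranks nodes differently, which is why GraphCSC-M keeps one
# centrality-specific embedding per measure before fusing them.
for name, scores in centralities.items():
    top = max(scores, key=scores.get)
    print(f"{name:12s} top node: {top} (score {scores[top]:.3f})")
```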
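GraphCSC-M is described as compressing multiple centrality-specific embeddings of a node into one compact vector via attentive multi-view learning. The paper's exact formulation is not reproduced in this record, so the sketch below shows only one common attention-style fusion pattern under assumed shapes: a query vector (which would be a learned parameter) scores each view, and the views are combined with softmax weights. The function name `attentive_fusion` and all dimensions are hypothetical.

```python
# Hypothetical sketch of attention-based fusion of per-centrality embeddings.
# This is NOT the paper's GraphCSC-M formulation, only a generic pattern:
# a (learned) query scores each view; softmax weights combine the views.
import numpy as np

def attentive_fusion(views: np.ndarray, query: np.ndarray) -> np.ndarray:
    """views: (num_views, dim) centrality-specific embeddings of one node.
    query: (dim,) attention query (a trained parameter in a real model).
    Returns a single (dim,) fused embedding."""
    scores = views @ query                      # one attention score per view
    weights = np.exp(scores - scores.max())     # numerically stable softmax
    weights /= weights.sum()
    return weights @ views                      # attention-weighted sum of views

# Example: 5 views (one per centrality measure), 16-dimensional embeddings.
rng = np.random.default_rng(0)
views = rng.normal(size=(5, 16))
query = rng.normal(size=16)                     # placeholder for a trained parameter
fused = attentive_fusion(views, query)
print(fused.shape)                              # (16,)
```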
Conference Title
Proceedings - International Conference on Data Engineering
Volume
2019-April
Subject
Distributed computing and systems software
Science & Technology
Computer Science, Information Systems