A Comprehensive Survey on Distributed Training of Graph Neural Networks
File version
Accepted Manuscript (AM)
Author(s)
Lin, H
Yan, M
Ye, X
Fan, D
Pan, S
Chen, W
Xie, Y
Abstract
Graph neural networks (GNNs) have proven to be a powerful algorithmic model across a broad range of application fields owing to their effectiveness in learning over graphs. To scale GNN training to large and ever-growing graphs, the most promising solution is distributed training, which spreads the training workload across multiple computing nodes. The body of research on distributed GNN training is already large and growing rapidly, and the approaches reported in these studies diverge significantly. This makes it difficult for newcomers to build a comprehensive understanding of the workflows, computational patterns, communication strategies, and optimization techniques employed in distributed GNN training, creating a pressing need for a survey that systematically identifies, analyzes, and compares the work in this field. In this article, we provide a comprehensive survey of distributed GNN training by investigating the various optimization techniques it employs. First, distributed GNN training is classified into several categories according to its workflow; the computational and communication patterns of each category, along with the optimization techniques proposed in recent work, are then introduced. Second, the software frameworks and hardware platforms for distributed GNN training are introduced for a deeper understanding. Third, distributed GNN training is compared with the distributed training of deep neural networks (DNNs), emphasizing what makes distributed GNN training unique. Finally, open issues and opportunities in this field are discussed.
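As a rough illustration of the communication pattern the abstract alludes to (a minimal sketch, not drawn from the paper itself; the toy graph, partition, and all function names are invented for illustration), the following single-process NumPy code simulates the halo exchange common to partition-based distributed GNN training: each worker owns a partition of the nodes and must fetch the features of boundary ("halo") neighbors held by other workers before it can aggregate.

# Hypothetical sketch of halo exchange in partition-based GNN training,
# simulated in one process with NumPy instead of real network transfers.
import numpy as np

# Toy graph: 6 nodes, undirected edges as (src, dst) pairs.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
feat = np.random.rand(6, 4)            # one 4-dim feature vector per node
part = {0: [0, 1, 2], 1: [3, 4, 5]}    # edge-cut partition across 2 workers

def halo_nodes(worker):
    """Remote neighbors whose features this worker must fetch."""
    local = set(part[worker])
    halo = set()
    for u, v in edges:
        if u in local and v not in local:
            halo.add(v)
        if v in local and u not in local:
            halo.add(u)
    return halo

def aggregate(worker):
    """Mean-aggregate neighbor features for each locally owned node."""
    local = set(part[worker])
    # In a real system this dictionary would be filled by network
    # communication (e.g., point-to-point or all-to-all transfers);
    # here we simply read the remote feature rows directly.
    fetched = {h: feat[h] for h in halo_nodes(worker)}
    out = {}
    for v in local:
        nbrs = [u for u, w in edges if w == v] + [w for u, w in edges if u == v]
        rows = [feat[u] if u in local else fetched[u] for u in nbrs]
        out[v] = np.mean(rows, axis=0)
    return out

for w in part:
    print(f"worker {w} fetches halo nodes {sorted(halo_nodes(w))}")
    aggregate(w)

Reducing the volume of this boundary traffic (through partitioning quality, caching, or communication scheduling) is among the optimization targets that surveys of distributed GNN training discuss.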
Journal Title
Proceedings of the IEEE
Volume
111
Issue
12
Rights Statement
This work is covered by copyright. You must assume that re-use is limited to personal use and that permission from the copyright owner must be obtained for all other uses. If the document is available under a specified licence, refer to the licence for details of permitted re-use. If you believe that this work infringes copyright please make a copyright takedown request using the form at https://www.griffith.edu.au/copyright-matters.
Subject
Electronics, sensors and digital hardware
Citation
Lin, H; Yan, M; Ye, X; Fan, D; Pan, S; Chen, W; Xie, Y, A Comprehensive Survey on Distributed Training of Graph Neural Networks, Proceedings of the IEEE, 2023, 111 (12), pp. 1572-1606