Boosting Graph Contrastive Learning via Adaptive Sampling

Author(s)
Wan, Sheng
Zhan, Yibing
Chen, Shuo
Pan, Shirui
Yang, Jian
Tao, Dacheng
Gong, Chen
Date
2023
Abstract

Contrastive learning (CL) is a prominent technique for self-supervised representation learning, which contrasts semantically similar (i.e., positive) and dissimilar (i.e., negative) pairs of examples under different augmented views. Recently, CL has shown unprecedented potential for learning expressive graph representations without external supervision. In graph CL, negative nodes are typically sampled uniformly from the augmented views to form the contrastive objective. However, this uniform negative sampling strategy limits the expressive power of contrastive models. Specifically, not all negative nodes provide sufficiently meaningful knowledge for effective contrastive representation learning. In addition, negative nodes that are semantically similar to the anchor are undesirably repelled from it, degrading model performance. To address these limitations, this article devises an adaptive sampling strategy termed "AdaS." The proposed AdaS framework can be trained to adaptively encode the importance of different negative nodes, thereby encouraging learning from the most informative graph nodes. Meanwhile, an auxiliary polarization regularizer is proposed to suppress the adverse impact of false negatives and enhance the discrimination ability of AdaS. Experimental results on a variety of real-world datasets verify the effectiveness of AdaS in improving the performance of graph CL.
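
The record does not include the method's implementation, so the following is only a minimal, hypothetical PyTorch sketch of the general idea described in the abstract: an InfoNCE-style graph contrastive loss in which a learnable scoring module re-weights candidate negatives, plus an illustrative polarization-style penalty on those weights. The names `adaptive_contrastive_loss`, `weight_net`, `tau`, and `lambda_pol` are placeholders introduced here and do not reflect the exact AdaS formulation; in practice, `z1` and `z2` would be node embeddings produced by a GNN encoder from two augmentations of the same graph.

```python
import torch
import torch.nn.functional as F

def adaptive_contrastive_loss(z1, z2, weight_net, tau=0.5, lambda_pol=0.1):
    # z1, z2: (N, d) node embeddings from two augmented views of the graph.
    # weight_net: illustrative module scoring how informative each negative pair is.
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    sim = torch.exp(z1 @ z2.t() / tau)                  # (N, N) cross-view similarities
    pos = sim.diag()                                    # positives: same node, other view
    w = torch.sigmoid(weight_net(z1, z2))               # (N, N) learned negative importance
    neg_mask = 1.0 - torch.eye(sim.size(0), device=sim.device)
    neg = (w * sim * neg_mask).sum(dim=1)               # adaptively weighted negatives
    loss = -torch.log(pos / (pos + neg)).mean()
    # Illustrative polarization-style penalty: pushes each weight toward 0 or 1,
    # so suspected false negatives can effectively be switched off.
    pol = (w * (1.0 - w) * neg_mask).sum() / neg_mask.sum()
    return loss + lambda_pol * pol
```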

Journal Title

IEEE Transactions on Neural Networks and Learning Systems

Note

This publication has been entered in Griffith Research Online as an advance online version.

Subject

Machine learning

Computer Science

Computer Science, Artificial Intelligence

Computer Science, Hardware & Architecture

Computer Science, Theory & Methods

Contrastive graph representation learning

Citation

Wan, S; Zhan, Y; Chen, S; Pan, S; Yang, J; Tao, D; Gong, C, Boosting Graph Contrastive Learning via Adaptive Sampling, IEEE Transactions on Neural Networks and Learning Systems, 2023
