Abstract
Graph contrastive learning has become an important approach to unsupervised graph representation learning; its key idea is to maximize the agreement between the representations of two augmented views produced by data augmentation. Existing graph contrastive learning models concentrate on topology augmentation that simply removes or adds edges between nodes at random, which may not only destroy important structure of the graph but also generate meaningless graphs, leading to a significant degradation of contrastive learning performance. In addition, current research focuses on minimizing the loss while ignoring intrinsic factors that affect the quality of node representations, which is detrimental to model training and to producing high-quality node representations. To address these issues, we propose intrinsic augmented graph contrastive learning, named IAG, which consists of two components: 1) In the topology augmentation part, we propose a novel augmentation strategy based on potential connections in the feature space, which complements traditional topology augmentation and allows different graphs to obtain augmentation strategies better suited to their own characteristics. 2) We explore the effect of the temperature coefficient in the loss function on the quality of the final representations and propose a dynamic temperature with a penalty term, which helps generate high-quality node representations. Finally, we conduct extensive node classification experiments on eight real-world datasets. The experimental results show that our proposed method is highly competitive with existing state-of-the-art baselines and even surpasses some supervised methods.
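To make the two ideas in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation: it adds edges between feature-similar but unconnected node pairs as a feature-space topology augmentation, and computes a temperature-scaled InfoNCE loss between two views. The top-k edge budget `k`, the use of cosine similarity, and treating the temperature `tau` as a plain argument (rather than the paper's dynamic temperature with a penalty term) are illustrative assumptions.

```python
# Illustrative sketch only; the exact IAG formulation (edge budget, similarity
# measure, dynamic temperature schedule, penalty term) is not specified here.
import torch
import torch.nn.functional as F


def feature_space_augment(adj: torch.Tensor, x: torch.Tensor, k: int = 2) -> torch.Tensor:
    """Add edges between each node and its k most feature-similar non-neighbors."""
    z = F.normalize(x, dim=1)
    sim = z @ z.T                                   # cosine similarity between node features
    sim.fill_diagonal_(-1.0)                        # exclude self-pairs
    sim = sim.masked_fill(adj.bool(), -1.0)         # exclude already-connected pairs
    topk = sim.topk(k, dim=1).indices               # k candidate partners per node
    aug = adj.clone()
    rows = torch.arange(adj.size(0)).unsqueeze(1).expand_as(topk)
    aug[rows, topk] = 1.0
    aug[topk, rows] = 1.0                           # keep the augmented graph undirected
    return aug


def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float) -> torch.Tensor:
    """InfoNCE between two views; tau is the temperature coefficient."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / tau
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)
```

In the method described by the abstract, `tau` would be adjusted dynamically during training under a penalty term rather than fixed; it is kept as a constant hyperparameter here only to keep the sketch self-contained.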
Acknowledgement
This study is funded in part by the National Natural Science Foundation of China (No. 61906002, 62076005, U20A20398), the Natural Science Foundation of Anhui Province (2008085QF306, 2008085MF191, 2008085UD07), and the University Synergy Innovation Program of Anhui Province, China (GXXT-2021-002).
About this paper
Cite this paper
Sun, D., Cao, M., Ding, Z., Luo, B. (2023). Graph Contrastive Learning with Intrinsic Augmentations. In: Pan, L., Zhao, D., Li, L., Lin, J. (eds) Bio-Inspired Computing: Theories and Applications. BIC-TA 2022. Communications in Computer and Information Science, vol 1801. Springer, Singapore. https://doi.org/10.1007/978-981-99-1549-1_27
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-1548-4
Online ISBN: 978-981-99-1549-1