HHSE: heterogeneous graph neural network via higher-order semantic enhancement

Abstract

Heterogeneous graph representation learning is highly expressive for large-scale relational graph data; its goal is to effectively capture both the semantic information and the heterogeneous structural information of the nodes in a graph. Current methods typically use shallow models to embed the semantic information of low-order neighbor nodes, which prevents higher-order semantic feature information from being fully retained. To address this issue, this paper proposes HHSE, a heterogeneous graph neural network for higher-order semantic enhancement. Specifically, our model uses the identity-mapping mechanism of residual attention at the node feature level to enhance the hidden-layer representations of nodes, and then employs two aggregation strategies to improve the retention of high-order semantic information. At the semantic feature level, the model learns the semantic information of nodes across the various meta-path subgraphs. Extensive experiments on node classification and node clustering over three real-world datasets show that the proposed approach achieves practical improvements over state-of-the-art methods. Moreover, our method is applicable to large-scale heterogeneous graph representation learning.
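The two-level design described in the abstract (node-level residual attention within each meta-path subgraph, followed by semantic-level fusion across subgraphs) can be illustrated with a short sketch. This is not the authors' implementation: it is a minimal, HAN-style approximation that assumes dense adjacency matrices, and all class and variable names (NodeLevelResidualAttention, SemanticLevelAttention, and so on) are hypothetical.

```python
# Minimal sketch of node-level residual attention + semantic-level fusion.
# Assumption: each meta-path subgraph is given as a dense 0/1 adjacency matrix.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NodeLevelResidualAttention(nn.Module):
    """Aggregates neighbours within one meta-path subgraph and adds an
    identity-mapped residual of the input features."""

    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)
        self.proj = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (N, dim) node features, adj: (N, N) adjacency of the subgraph
        n = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1), h.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )
        e = F.leaky_relu(self.attn(pairs).squeeze(-1))       # (N, N) raw scores
        e = e.masked_fill(adj == 0, float("-inf"))           # keep only edges
        alpha = torch.nan_to_num(torch.softmax(e, dim=1))    # isolated nodes -> 0
        out = alpha @ self.proj(h)
        return F.elu(out) + h                                 # residual identity map


class SemanticLevelAttention(nn.Module):
    """Fuses the per-meta-path embeddings with learned semantic weights."""

    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1, bias=False)
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (P, N, dim) one embedding matrix per meta-path subgraph
        beta = torch.softmax(self.score(z).mean(dim=1), dim=0)  # (P, 1) weights
        return (beta.unsqueeze(-1) * z).sum(0)                  # (N, dim) fused


if __name__ == "__main__":
    N, D, P = 6, 16, 2
    h = torch.randn(N, D)
    adjs = [torch.randint(0, 2, (N, N)).float() for _ in range(P)]
    node_att = NodeLevelResidualAttention(D)
    sem_att = SemanticLevelAttention(D)
    z = torch.stack([node_att(h, a) for a in adjs])  # (P, N, D)
    print(sem_att(z).shape)                          # torch.Size([6, 16])
```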


Availability of data and materials

The three datasets used in this paper are available from DBLP (https://dblp.uni-trier.de), ACM (http://dl.acm.org), and IMDB (https://www.imdb.com), and are referenced in the text where relevant.


Acknowledgements

We thank the High Performance Computing Research Department of the Gansu Provincial Computing Center, China, for providing computing services to support this work.

Funding

This research was supported by the National Natural Science Foundation of China (No. 61962054).

Author information

Corresponding author

Correspondence to Cuntao Ma.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this article.

Ethics approval

The requirements are met.

Consent to participate

The requirements are met.

Consent for publication

The requirements are met.

Code availability

The code is available from the authors but has not yet been uploaded to an online platform.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Du, H., Ma, C., Lu, D. et al. HHSE: heterogeneous graph neural network via higher-order semantic enhancement. Computing 106, 865–887 (2024). https://doi.org/10.1007/s00607-023-01246-x
