A Review of Hypergraph Neural Networks
DOI: https://doi.org/10.4108/eetel.7064

Keywords: Graph Neural Networks, Hypergraph Neural Networks, Graph Structure, Hypergraph Structure

Abstract
In recent years, Graph Neural Networks (GNNs) have achieved notable success in fields such as recommendation systems and natural language processing, driven largely by the availability of large-scale data and powerful computational resources. GNNs are primarily designed for graph data that encode pairwise relationships. In many real-world networks, however, the relationships between entities are more complex than simple pairwise connections, as in scientific collaboration networks, protein networks, and similar domains. Forcing such relationships into the pairwise edges of an ordinary graph leads to information loss. A hypergraph, as a special kind of graph-structured data, can represent higher-order relationships that graphs cannot fully capture, thereby overcoming this limitation. In light of this, researchers have begun to study how to design neural networks on hypergraphs, and a range of hypergraph neural network (HGNN) models have been proposed for downstream tasks. This paper therefore reviews existing hypergraph neural network models from two perspectives: spectral analysis methods and neural network methods on hypergraphs, covering both expansion-based and non-expansion-based approaches, which are further subdivided according to their algorithmic characteristics and application scenarios. The design ideas of the various algorithms are then analyzed and compared, and the advantages and disadvantages of each class of algorithms are summarized on the basis of reported experimental results. Finally, potential future research directions in hypergraph learning are discussed.
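As an illustration of the higher-order structure discussed above, the following minimal Python sketch (not taken from the reviewed models; the vertex counts, feature sizes, and weights are purely illustrative) builds the incidence matrix H of a small hypergraph and applies one layer of the spectral hypergraph convolution popularized by Feng et al.'s HGNN, X' = σ(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Θ), on which many of the reviewed models build.

```python
import numpy as np

def incidence_matrix(num_nodes, hyperedges):
    """|V| x |E| incidence matrix: H[v, e] = 1 if vertex v belongs to hyperedge e."""
    H = np.zeros((num_nodes, len(hyperedges)))
    for e, verts in enumerate(hyperedges):
        H[list(verts), e] = 1.0
    return H

def hgnn_layer(X, H, Theta, w=None):
    """One spectral hypergraph convolution:
    X' = ReLU( Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta )."""
    w = np.ones(H.shape[1]) if w is None else np.asarray(w)  # hyperedge weights
    Dv = H @ w                                               # weighted vertex degrees
    De = H.sum(axis=0)                                       # hyperedge degrees (sizes)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    W = np.diag(w)
    # Normalized hypergraph propagation operator.
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)                    # ReLU activation

# Toy hypergraph: 5 vertices, 3 hyperedges, including hyperedges of size 3.
hyperedges = [{0, 1, 2}, {1, 3}, {2, 3, 4}]
H = incidence_matrix(5, hyperedges)
X = np.random.randn(5, 4)      # illustrative 4-dimensional vertex features
Theta = np.random.randn(4, 2)  # illustrative learnable projection
print(hgnn_layer(X, H, Theta).shape)  # -> (5, 2)
```

Note that the hyperedge {0, 1, 2} connects three vertices at once; reducing it to pairwise edges (a clique expansion) would discard the fact that all three vertices participate in a single shared relation, which is exactly the information loss the abstract refers to.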