ABSTRACT
Node embedding is one of the most widely adopted techniques in graph analysis tasks such as node classification. Node embedding methods can be broadly classified into three categories: proximity matrix factorization approaches, sampling-based methods, and deep learning strategies. Among the deep learning strategies, graph contrastive learning has attracted significant interest. Yet existing graph contrastive learning approaches have been observed to inadequately preserve the local topological structure of the original graph, particularly when neighboring nodes belong to different categories. To address this challenge, this paper introduces a novel node embedding approach named Locally Linear Contrastive Embedding (LLaCE). LLaCE maintains the intrinsic geometric structure of graph data through a locally linear formulation, ensuring that local topological characteristics are accurately reflected in the embedding space. Experimental results on one synthetic dataset and five real-world datasets validate the effectiveness of the proposed method.
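To illustrate the kind of locally linear formulation the abstract refers to, the sketch below computes classic locally-linear-embedding-style reconstruction weights (in the spirit of Roweis and Saul, 2000): each node is approximated as a weighted combination of its neighbors, with weights constrained to sum to one. This is a minimal illustration of the general technique, not the authors' actual LLaCE objective; the function name, the `reg` regularization parameter, and the neighbor-dictionary input format are all assumptions made for this example.

```python
import numpy as np

def locally_linear_weights(X, neighbors, reg=1e-3):
    """Compute LLE-style reconstruction weights (illustrative sketch).

    X         : (n, d) array of node features or coordinates.
    neighbors : dict mapping node index -> list of neighbor indices.
    reg       : small regularizer keeping the local Gram matrix invertible.

    Returns an (n, n) weight matrix W where row i reconstructs node i from
    its neighbors and each row sums to one.
    """
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = neighbors[i]
        Z = X[nbrs] - X[i]                 # neighbors centered on node i
        G = Z @ Z.T                        # local Gram matrix
        trace = np.trace(G)
        G += reg * (trace if trace > 0 else 1.0) * np.eye(len(nbrs))
        w = np.linalg.solve(G, np.ones(len(nbrs)))
        W[i, nbrs] = w / w.sum()           # enforce sum-to-one constraint
    return W

# Toy example: node 0 lies at the midpoint of nodes 1 and 2,
# so its reconstruction weights should be close to [0.5, 0.5].
X = np.array([[0.0, 0.0], [1.0, 0.0], [-1.0, 0.0]])
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
W = locally_linear_weights(X, neighbors)
```

A contrastive method that preserves local geometry could then penalize, in the embedding space, the discrepancy between each embedded node and the same weighted combination of its embedded neighbors.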