
Inductive Light Graph Convolution Network for Text Classification Based on Word-Label Graph

  • Conference paper

Part of the book series: IFIP Advances in Information and Communication Technology ((IFIPAICT,volume 643))

Abstract

Graph Convolution Networks (GCNs) have recently flourished in the field of text classification, for example the Text Graph Convolution Network (TextGCN). However, the good performance of these methods relies on building a graph whose nodes cover an entire corpus, which makes the models transductive. Moreover, rich label information has not been exploited in the graph structure. In this paper, we propose a new model, the Inductive Light Graph Convolution Network (ILGCN), with a new graph construction. It builds the graph from labels and words, which removes the dependence between an individual text and the entire corpus and makes ILGCN inductive. In addition, we simplify the model structure, retaining only neighborhood aggregation, the most important component of GCNs. Experiments on multiple benchmarks show that our model outperforms existing state-of-the-art models on several text classification datasets.
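The abstract's core idea, parameter-free neighborhood aggregation (in the spirit of LightGCN) over a graph of word and label nodes, can be sketched roughly as follows. The toy adjacency matrix, feature initialization, and mean-pooling readout below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, standard in GCNs.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def light_propagate(A, X, num_layers=2):
    # Parameter-free neighborhood aggregation: no weight matrices,
    # no nonlinearities; layer outputs are averaged, LightGCN-style.
    A_norm = normalize_adj(A)
    layers = [X]
    for _ in range(num_layers):
        layers.append(A_norm @ layers[-1])
    return np.mean(layers, axis=0)

# Toy word-label graph: 3 word nodes + 2 label nodes. In practice the
# word-label and word-word edge weights might come from statistics such
# as TF-IDF and co-occurrence (an assumption here, not the paper's spec).
A = np.array([
    [0, 1, 0, 1, 0],   # word 0
    [1, 0, 1, 1, 0],   # word 1
    [0, 1, 0, 0, 1],   # word 2
    [1, 1, 0, 0, 0],   # label A
    [0, 0, 1, 0, 0],   # label B
], dtype=float)
X = np.random.default_rng(0).normal(size=(5, 8))  # initial node features

H = light_propagate(A, X, num_layers=2)
doc = H[:3].mean(axis=0)   # a document = mean of its word embeddings
scores = H[3:] @ doc       # similarity of the document to each label node
print("predicted label:", int(np.argmax(scores)))
```

Because propagation needs only the words appearing in a given text plus the fixed set of label nodes, no corpus-level graph has to be rebuilt for an unseen document, which is what makes such a construction inductive.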


Notes

  1. https://www.cs.umb.edu/smimarog/textmining/datasets/.

  2. http://disi.unitn.it/moschitti/corpora.htm.

  3. https://github.com/mnqu/PTE/tree/master/data/mr.

References

  1. Jindal, N., Liu, B.: Review spam detection. In: Proceedings of the 16th International Conference on World Wide Web (2007)
  2. Zeng, Z., Deng, Yu., Li, X., Naumann, T., Luo, Y.: Natural language processing for EHR-based computational phenotyping. IEEE/ACM Trans. Comput. Biol. Bioinf. 16(1), 139–153 (2019)
  3. Joachims, T.: A probabilistic analysis of the Rocchio algorithm with TFIDF for text categorization. Department of Computer Science, Carnegie Mellon University, Pittsburgh, PA (1996)
  4. Zhang, Y., Jin, R., Zhou, Z.-H.: Understanding bag-of-words model: a statistical framework. Int. J. Mach. Learn. Cybern. 1(1–4), 43–52 (2010)
  5. Wang, S.I., Manning, C.D.: Baselines and bigrams: simple, good sentiment and topic classification. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) (2012)
  6. Kim, Y.: Convolutional neural networks for sentence classification. arXiv preprint arXiv:1408.5882 (2014)
  7. Kalchbrenner, N., Grefenstette, E., Blunsom, P.: A convolutional neural network for modelling sentences. arXiv preprint arXiv:1404.2188 (2014)
  8. Gehring, J., et al.: Convolutional sequence to sequence learning. In: International Conference on Machine Learning (PMLR) (2017)
  9. Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016)
  10. Veličković, P., et al.: Graph attention networks. arXiv preprint arXiv:1710.10903 (2017)
  11. Hamilton, W.L., Ying, R., Leskovec, J.: Inductive representation learning on large graphs. arXiv preprint arXiv:1706.02216 (2017)
  12. Defferrard, M., Bresson, X., Vandergheynst, P.: Convolutional neural networks on graphs with fast localized spectral filtering. arXiv preprint arXiv:1606.09375 (2016)
  13. Yao, L., Mao, C., Luo, Y.: Graph convolutional networks for text classification. Proc. AAAI Conf. Artif. Intell. 33, 7370–7377 (2019)
  14. Wu, F., et al.: Simplifying graph convolutional networks. In: International Conference on Machine Learning (PMLR) (2019)
  15. Zhu, H., Koniusz, P.: Simple spectral graph convolution. In: International Conference on Learning Representations (2021)
  16. He, X., et al.: LightGCN: simplifying and powering graph convolution network for recommendation. In: Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval (2020)
  17. Wang, X., et al.: Neural graph collaborative filtering. In: Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (2019)
  18. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014)
  19. Bruna, J., et al.: Spectral networks and locally connected networks on graphs. arXiv preprint arXiv:1312.6203 (2013)
  20. Henaff, M., Bruna, J., LeCun, Y.: Deep convolutional networks on graph-structured data. arXiv preprint arXiv:1506.05163 (2015)
  21. Micheli, A.: Neural network for graphs: a contextual constructive approach. IEEE Trans. Neural Netw. 20(3), 498–511 (2009)
  22. Atwood, J., Towsley, D.: Diffusion-convolutional neural networks. In: Advances in Neural Information Processing Systems (2016)
  23. Li, Y., et al.: Diffusion convolutional recurrent neural network: data-driven traffic forecasting. arXiv preprint arXiv:1707.01926 (2017)
  24. Monti, F., et al.: Geometric deep learning on graphs and manifolds using mixture model CNNs. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2017)
  25. Xu, K., et al.: How powerful are graph neural networks? arXiv preprint arXiv:1810.00826 (2018)
  26. Chen, D., Lin, Y., Li, W., Li, P., Zhou, J., Sun, X.: Measuring and relieving the over-smoothing problem for graph neural networks from the topological view. Proc. AAAI Conf. Artif. Intell. 34(04), 3438–3445 (2020)
  27. Ma, J., et al.: Disentangled graph convolutional networks. In: International Conference on Machine Learning (PMLR) (2019)
  28. Wang, X., et al.: Disentangled graph collaborative filtering. In: Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval (2020)
  29. Burges, C.J.C.: A tutorial on support vector machines for pattern recognition. Data Min. Knowl. Disc. 2(2), 121–167 (1998)
  30. Leung, K.M.: Naive Bayesian classifier, pp. 123–156. Polytechnic University Department of Computer Science/Finance and Risk Engineering (2007)
  31. Cover, T., Hart, P.: Nearest neighbor pattern classification. IEEE Trans. Inf. Theor. 13(1), 21–27 (1967)
  32. Vaswani, A., et al.: Attention is all you need. arXiv preprint arXiv:1706.03762 (2017)
  33. Devlin, J., et al.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)


Acknowledgment

This work is supported by National Key Research and Development Project (2018YFE0119700), Key Research and Development Project of Shandong Province (2019JZZY010132, 2019-0101), the Natural Foundation of Shandong Province (ZR2018MF003), Plan of Youth Innovation Team Development of colleges and universities in Shandong Province (SD2019-161).

Author information

Corresponding author: Xiaoming Wu.


Copyright information

© 2022 IFIP International Federation for Information Processing

About this paper

Cite this paper

Shi, J., Wu, X., Liu, X., Lu, W., Li, S. (2022). Inductive Light Graph Convolution Network for Text Classification Based on Word-Label Graph. In: Shi, Z., Zucker, JD., An, B. (eds) Intelligent Information Processing XI. IIP 2022. IFIP Advances in Information and Communication Technology, vol 643. Springer, Cham. https://doi.org/10.1007/978-3-031-03948-5_4


  • DOI: https://doi.org/10.1007/978-3-031-03948-5_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-03947-8

  • Online ISBN: 978-3-031-03948-5

  • eBook Packages: Computer Science (R0)
