ABSTRACT
Script event prediction is a challenging task whose goal is to predict the subsequent event given a sequence of observed events. Since events are described in text, pre-trained language models have been applied to event representation. However, embeddings from pre-trained models are sensitive to the short text format of events, and existing work does not handle this well. In addition, previous models focus on semantic similarity and overlook unexpected turns: a turning event at the tail of the event chain can easily redirect the follow-up. This paper proposes a new preprocessing method consisting of cleaning, alignment, and connection, which yields richer event representations. On this basis, we concatenate the embedding of the CLS token with the event-sequence embeddings to integrate the semantic and temporal features of the event chain. To address event turning, we propose a tail event enhancement module that feeds the transition probability between the tail event and each candidate event into the prediction layer, so that prediction does not rely on semantic features alone. Extensive comparative and ablation experiments confirm the superiority of our model over the baselines.
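As a rough illustration of the prediction layer described above, the following is a minimal PyTorch sketch. It assumes a RoBERTa-style encoder that yields a CLS embedding and per-event embeddings; the class name `TailEventEnhancedScorer`, the mean pooling, and the blending weight `alpha` are illustrative choices of ours, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class TailEventEnhancedScorer(nn.Module):
    """Hypothetical sketch: scores a candidate event by combining
    (a) a semantic score from the concatenated CLS and event-sequence
    embeddings, and (b) the transition probability from the tail
    (last observed) event to the candidate event."""

    def __init__(self, hidden_size: int, alpha: float = 0.5):
        super().__init__()
        # Semantic score over [CLS embedding || pooled event-sequence embedding].
        self.scorer = nn.Linear(2 * hidden_size, 1)
        # Weight balancing semantic similarity against transition probability
        # (an assumed hyperparameter, not taken from the paper).
        self.alpha = alpha

    def forward(self, cls_emb, event_seq_emb, transition_prob):
        # cls_emb:         (batch, hidden)          CLS token embedding
        # event_seq_emb:   (batch, seq_len, hidden) per-event embeddings
        # transition_prob: (batch,)                 P(candidate | tail event),
        #                  e.g. estimated from an event co-occurrence graph
        pooled = event_seq_emb.mean(dim=1)               # pool the event chain
        features = torch.cat([cls_emb, pooled], dim=-1)  # fuse semantic + temporal
        semantic_score = self.scorer(features).squeeze(-1)
        # Blend semantic similarity with the tail-to-candidate transition
        # probability so prediction is not driven by semantics alone.
        return self.alpha * semantic_score + (1 - self.alpha) * transition_prob
```

A candidate list can then be ranked by calling the scorer once per candidate and taking the argmax of the returned scores.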