Abstract
The goal of dialogue topic shift detection is to identify whether the current topic in a conversation has changed or needs to change. Previous work focused on detecting topic shifts by using pre-trained models to encode utterances, without delving into the various levels of topic granularity in a dialogue or understanding its content. To address these issues, we take a prompt-based approach to fully extract topic information from dialogues at multiple levels of granularity, i.e., label, turn, and topic. Experimental results on our annotated Chinese Natural Topic Dialogue dataset (CNTD) and the publicly available English TIAGE dataset show that the proposed model outperforms the baselines. Further experiments show that the information extracted at the different levels of granularity effectively helps the model comprehend conversation topics.
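To make the prompt-based formulation concrete, the sketch below shows one hypothetical way to compose a cloze-style prompt that combines the three granularities the abstract names (turn, topic, and label). The template, the `<mask>` slot, and the verbalizer mapping are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of a prompt-based formulation for dialogue topic
# shift detection. The template and verbalizer are assumptions for
# illustration; the paper's actual prompts may differ.

def build_prompt(context_turns, current_turn, topic_keywords):
    """Compose a cloze-style prompt combining three granularities:
    turn-level context, topic-level keywords, and a label slot."""
    context = " [SEP] ".join(context_turns)   # turn granularity
    topics = ", ".join(topic_keywords)        # topic granularity
    return (
        f"Dialogue: {context} [SEP] {current_turn} "
        f"Topics so far: {topics}. "
        f"The topic of the last utterance has <mask> changed."  # label slot
    )

# A verbalizer maps class labels to the words a masked language model
# is asked to predict in the <mask> position.
VERBALIZER = {"shift": "indeed", "no_shift": "not"}

prompt = build_prompt(
    ["I watched a great film last night.", "Which one?"],
    "By the way, are you free this weekend?",
    ["movies"],
)
print(prompt)
```

At inference time, such a prompt would be fed to a pre-trained masked or text-to-text model (e.g., a T5-style encoder-decoder), and the label whose verbalized word scores highest at the `<mask>` position would be chosen.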
Acknowledgements
The authors would like to thank the three anonymous reviewers for their comments on this paper. This research was supported by the National Natural Science Foundation of China (Nos. 62276177 and 61836007) and by a project funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Lin, J., Fan, Y., Chu, X., Li, P., Zhu, Q. (2023). Multi-granularity Prompts for Topic Shift Detection in Dialogue. In: Huang, D.S., Premaratne, P., Jin, B., Qu, B., Jo, K.H., Hussain, A. (eds.) Advanced Intelligent Computing Technology and Applications. ICIC 2023. Lecture Notes in Computer Science, vol. 14089. Springer, Singapore. https://doi.org/10.1007/978-981-99-4752-2_42
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-4751-5
Online ISBN: 978-981-99-4752-2
eBook Packages: Computer Science; Computer Science (R0)