Graph-Enhanced Document Representation for Court Case Retrieval

  • Conference paper
  • First Online:
Advances in Information Retrieval (ECIR 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13186)

Abstract

To reach informed decisions, legal domain experts in Civil Law systems need knowledge not only of legal paragraphs but also of related court cases. However, court case retrieval is challenging due to domain-specific language and large document sizes. While modern transformer models such as BERT create dense text representations suitable for efficient retrieval in many domains, without domain-specific adaptations they are outperformed by established lexical retrieval models in the legal domain. Although citations of court cases and codified law play an important role in the domain, there has been little research on combining text representations with citation graph data for court case retrieval. In other domains, attempts have been made to combine the two, for example by concatenating graph embeddings to text embeddings. This PhD research project will tackle domain-specific challenges of legal retrieval systems. To support this task, a dataset of Austrian court cases, their document labels, and their citations of other court cases and codified law at the document and paragraph level will be created and made public. Experiments in this project will include various ways of enhancing transformer-based text representations with citation graph data, such as graph-based transformer re-training or graph embeddings.
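
As an illustration of the kind of combination mentioned above, the sketch below concatenates a transformer text embedding of a case with a precomputed citation-graph node embedding. It is a minimal sketch under stated assumptions, not this project's method: the checkpoint name, the 128-dimensional graph vectors, and the graph_embeddings lookup are placeholders, and in practice the graph vectors could come from a method such as node2vec or GraphSAGE run over the case citation graph.

    # Minimal sketch (Python): concatenating a text embedding with a
    # citation-graph node embedding. The checkpoint, the dimensions and the
    # graph_embeddings lookup are illustrative assumptions.
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-german-cased")
    encoder = AutoModel.from_pretrained("bert-base-german-cased")

    def text_embedding(text: str) -> torch.Tensor:
        """Encode a (truncated) court case text into its [CLS] vector."""
        inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
        with torch.no_grad():
            outputs = encoder(**inputs)
        return outputs.last_hidden_state[:, 0, :].squeeze(0)  # shape: (768,)

    # Hypothetical precomputed node embeddings for each case, e.g. from node2vec
    # or GraphSAGE over the citation graph (case -> cited cases / statutes).
    graph_embeddings = {"case_001": torch.randn(128)}

    def combined_representation(case_id: str, text: str) -> torch.Tensor:
        """Concatenate the text embedding with the citation-graph embedding."""
        return torch.cat([text_embedding(text), graph_embeddings[case_id]], dim=0)

    vec = combined_representation("case_001", "Der Verwaltungsgerichtshof hat erkannt ...")
    print(vec.shape)  # torch.Size([896]) = 768 text dims + 128 graph dims

Retrieval would then score queries against documents by similarity (e.g. dot product) over such concatenated vectors; the other direction named in the abstract, graph-based transformer re-training, instead adapts the text encoder itself with a citation-informed objective rather than appending a separate graph vector.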

Notes

  1. https://sites.ualberta.ca/~rabelo/COLIEE2021/.

  2. https://digitales.wien.gv.at/projekt/brisevienna/.

  3. https://www.ris.bka.gv.at/Vwgh/.

Author information

Corresponding author

Correspondence to Tobias Fink.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Fink, T. (2022). Graph-Enhanced Document Representation for Court Case Retrieval. In: Hagen, M., et al. Advances in Information Retrieval. ECIR 2022. Lecture Notes in Computer Science, vol 13186. Springer, Cham. https://doi.org/10.1007/978-3-030-99739-7_59

  • DOI: https://doi.org/10.1007/978-3-030-99739-7_59

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-99738-0

  • Online ISBN: 978-3-030-99739-7

  • eBook Packages: Computer Science, Computer Science (R0)
