SRENet: a spatiotemporal relationship-enhanced 2D-CNN-based framework for staging and segmentation of kidney cancer using CT images


Abstract

Kidney cancer (KC) is among the 10 most common cancers threatening human health, with an average lifetime risk of 1.53%. Computed tomography (CT) is regarded as the gold standard for the characterization of KC and is widely used for KC prognosis. However, it is challenging to segment KC in CT images and perform cancer staging simultaneously, because kidney tumors vary in position and shape and the background and target areas share similar textural features. We propose a novel spatiotemporal relationship-enhanced convolutional neural network (CNN)-based framework called SRENet. It consists of a spatial transformer framework and a residual U-Net with a temporal relationship extraction module for the staging and segmentation of KC. SRENet achieves excellent performance on six evaluation metrics (Kappa, sensitivity, specificity, precision, accuracy, and F1-score) for KC staging, with an F1-score of 98.46%. The framework demonstrates a strong and reliable capacity for kidney and KC segmentation, with Dice coefficients (DCs) of 97.89% and 92.54%, respectively, outperforming state-of-the-art models (ResNeXt-101, ViT, and Swin-Transformer for staging; VNet, UNet 3+, and nnU-Net for segmentation). The proposed SRENet helps accelerate the development of reliable kidney and KC segmentation methodologies and shows significant potential for KC diagnosis in clinical practice.
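For concreteness, the sketch below shows the standard definitions behind two of the reported scores: the Dice coefficient used for the kidney and KC segmentation results and the F1-score used for staging. The NumPy-based implementation and variable names are illustrative only and are not taken from the paper.

```python
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient DC = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + gt.sum() + eps))

def f1_score(pred_labels: np.ndarray, gt_labels: np.ndarray, positive: int = 1) -> float:
    """F1 = 2 * precision * recall / (precision + recall) for one staging class."""
    tp = np.sum((pred_labels == positive) & (gt_labels == positive))
    fp = np.sum((pred_labels == positive) & (gt_labels != positive))
    fn = np.sum((pred_labels != positive) & (gt_labels == positive))
    precision = tp / (tp + fp + 1e-7)
    recall = tp / (tp + fn + 1e-7)
    return float(2 * precision * recall / (precision + recall + 1e-7))
```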



Data Availability

The CT data that support the findings of this study are available in the KiTS19 dataset with the identifier “https://doi.org/10.1016/j.media.2020.101821” [33].
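As an illustration only, a minimal sketch of loading one KiTS19 case is shown below, assuming the directory layout used by the public KiTS19 repository (per-case folders containing imaging.nii.gz and segmentation.nii.gz) and the nibabel library; this is not code from the paper.

```python
from pathlib import Path

import nibabel as nib  # pip install nibabel

def load_kits19_case(data_dir: str, case_id: int):
    """Load the CT volume and segmentation mask for one KiTS19 case.

    Assumes the layout of the public KiTS19 repository:
    <data_dir>/case_00000/imaging.nii.gz and segmentation.nii.gz.
    """
    case_dir = Path(data_dir) / f"case_{case_id:05d}"
    image = nib.load(case_dir / "imaging.nii.gz").get_fdata()
    mask = nib.load(case_dir / "segmentation.nii.gz").get_fdata()
    return image, mask

# Example usage: volume, seg = load_kits19_case("kits19/data", 0)
```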

References

  1. Drake CG, Lipson EJ, Brahmer JR (2014) Breathing new life into immunotherapy: review of melanoma, lung and kidney cancer. Nat Rev Clin Oncol 11(1):24

  2. Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, Jemal A (2018) Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin 68(6):394–424

  3. Du Z, Chen W, Xia Q, Shi O, Chen Q (2020) Trends and projections of kidney cancer incidence at the global and national levels, 1990–2030: a Bayesian age-period-cohort modeling study. Biomark Res 8:1–10

  4. Patel HV, Srivastava A, Shinder B, Sadimin E, Singer EA (2019) Strengthening the foundation of kidney cancer treatment and research: revising the AJCC staging system. Ann Transl Med 7(Suppl 1)

  5. American Cancer Society (2021) Kidney cancer stages. https://www.cancer.org/cancer/kidney-cancer/detection-diagnosis-staging/staging.html. Accessed May 1, 2021

  6. National Cancer Institute (2021) Kidney and renal pelvis cancer. https://seer.cancer.gov/statfacts/html/kidrp.html. Accessed May 1, 2021

  7. American Cancer Society (2022) Kidney cancer symptoms. https://www.cancer.org/cancer/kidney-cancer/detection-diagnosis-staging/signs-and-symptoms.html. Accessed September 16, 2022

  8. American Cancer Society (2021) Kidney cancer stages. https://www.cancer.org/cancer/kidney-cancer/detection-diagnosis-staging/signs-and-symptoms.html. Accessed May 1, 2021

  9. Alnazer I, Bourdon P, Urruty T, Falou O, Khalil M, Shahin A, Fernandez-Maloigne C (2021) Recent advances in medical image processing for the evaluation of chronic kidney disease. Med Image Anal 69:101960

  10. van Oostenbrugge TJ, Fütterer JJ, Mulders PF (2018) Diagnostic imaging for solid renal tumors: a pictorial review. Kidney Cancer 2(2):79–93

  11. Boni E, Alfred C, Freear S, Jensen JA, Tortoli P (2018) Ultrasound open platforms for next-generation imaging technique development. IEEE Trans Ultrason Ferroelectr Freq Control 65(7):1078–1092

  12. Debette S, Schilling S, Duperron M-G, Larsson SC, Markus HS (2019) Clinical significance of magnetic resonance imaging markers of vascular brain injury: a systematic review and meta-analysis. JAMA Neurol 76(1):81–94

  13. He L, Yu H, Shi L, He Y, Geng J, Wei Y, Sun H, Chen Y (2018) Equity assessment of the distribution of CT and MRI scanners in China: a panel data analysis. Int J Equity Health 17(1):1–10

  14. Khaing M, Saw YM, Than TM, Mon AM, Cho SM, Saw TN, Kariya T, Yamamoto E, Hamajima N (2020) Geographic distribution and utilisation of CT and MRI services at public hospitals in Myanmar. BMC Health Serv Res 20(1):1–14

  15. Beaulieu J, Dutilleul P (2019) Applications of computed tomography (CT) scanning technology in forest research: a timely update and review. Can J For Res 49(10):1173–1188

  16. Ljungberg B, Bensalah K, Canfield S, Dabestani S, Hofmann F, Hora M, Kuczyk MA, Lam T, Marconi L, Merseburger AS (2015) EAU guidelines on renal cell carcinoma: 2014 update. Eur Urol 67(5):913–924

  17. LeCun Y, Bengio Y, Hinton G (2015) Deep learning. Nature 521(7553):436–444

  18. Rawat W, Wang Z (2017) Deep convolutional neural networks for image classification: a comprehensive review. Neural Comput 29(9):2352–2449

  19. Cai L, Gao J, Zhao D (2020) A review of the application of deep learning in medical image classification and segmentation. Ann Transl Med 8(11)

  20. Xie S, Girshick R, Dollár P, Tu Z, He K (2017) Aggregated residual transformations for deep neural networks. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 1492–1500

  21. Minaee S, Boykov YY, Porikli F, Plaza AJ, Kehtarnavaz N (2021) Image segmentation using deep learning: a survey. IEEE Trans Pattern Anal Mach Intell

  22. Tajbakhsh N, Jeyaseelan L, Li Q, Chiang JN, Wu Z, Ding X (2020) Embracing imperfect datasets: a review of deep learning solutions for medical image segmentation. Med Image Anal 63:101693

  23. Huang H, Lin L, Tong R, Hu H, Zhang Q, Iwamoto Y, Han X, Chen Y-W, Wu J (2020) UNet 3+: a full-scale connected UNet for medical image segmentation. In: ICASSP 2020 – 2020 IEEE international conference on acoustics, speech and signal processing (ICASSP), pp 1055–1059

  24. Türk F, Lüy M (2020) Kidney and renal tumor segmentation using a hybrid V-Net-based model. Mathematics 8(10):1772

  25. Couteaux V, Si-Mohamed S, Renard-Penna R, Nempont O, Lefevre T, Popoff A, Pizaine G, Villain N, Bloch I, Behr J (2019) Kidney cortex segmentation in 2D CT with U-Nets ensemble aggregation. Diagn Interv Imaging 100(4):211–217

  26. Yu Q, Shi Y, Sun J, Gao Y, Zhu J, Dai Y (2019) Crossbar-Net: a novel convolutional neural network for kidney tumor segmentation in CT images. IEEE Trans Image Process 28(8):4060–4074

  27. Limkin EJ, Reuzé S, Carré A, Sun R, Schernberg A, Alexis A, Deutsch E, Ferté C, Robert C (2019) The complexity of tumor shape, spiculatedness, correlates with tumor radiomic shape features. Sci Rep 9(1):1–12

  28. Lin Z, Cui Y, Liu J, Sun Z, Ma S, Zhang X, Wang X (2021) Automated segmentation of kidney and renal mass and automated detection of renal mass in CT urography using 3D U-Net-based deep convolutional neural network. Eur Radiol, pp 1–11

  29. Isensee F, Jaeger PF, Kohl SA, Petersen J, Maier-Hein KH (2021) nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nat Methods 18(2):203–211

  30. Dosovitskiy A, Beyer L, Kolesnikov A, Weissenborn D, Zhai X, Unterthiner T, Dehghani M, Minderer M, Heigold G, Gelly S et al (2020) An image is worth 16x16 words: transformers for image recognition at scale. arXiv:2010.11929

  31. Liu Z, Hu H, Lin Y, Yao Z, Xie Z, Wei Y, Ning J, Cao Y, Zhang Z, Dong L (2022) Swin Transformer V2: scaling up capacity and resolution. In: Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, pp 12009–12019

  32. Achanta R, Shaji A, Smith K, Lucchi A, Fua P, Süsstrunk S (2012) SLIC superpixels compared to state-of-the-art superpixel methods. IEEE Trans Pattern Anal Mach Intell 34(11):2274–2282

  33. Heller N, Isensee F, Maier-Hein KH, Hou X, Xie C, Li F, Nan Y, Mu G, Lin Z, Han M et al (2020) The state of the art in kidney and kidney tumor segmentation in contrast-enhanced CT imaging: results of the KiTS19 challenge. Med Image Anal 67:101821


Acknowledgements

The authors would like to thank the National Natural Science Foundation of China (Grant No. 61876059) for its support.

Author information

Corresponding author

Correspondence to Yu Gu.

Ethics declarations

Conflict of Interests

The authors declare no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Shuang Liang and Yu Gu contributed equally to this work.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liang, S., Gu, Y. SRENet: a spatiotemporal relationship-enhanced 2D-CNN-based framework for staging and segmentation of kidney cancer using CT images. Appl Intell 53, 17061–17073 (2023). https://doi.org/10.1007/s10489-022-04384-5

