NaMa: Neighbor-Aware Multi-Modal Adaptive Learning for Prostate Tumor Segmentation on Anisotropic MR Images

Authors

  • Runqi Meng ShanghaiTech University
  • Xiao Zhang Northwest University; ShanghaiTech University
  • Shijie Huang ShanghaiTech University
  • Yuning Gu ShanghaiTech University
  • Guiqin Liu Shanghai Jiao Tong University School of Medicine
  • Guangyu Wu Shanghai Jiao Tong University School of Medicine
  • Nizhuan Wang ShanghaiTech University
  • Kaicong Sun ShanghaiTech University
  • Dinggang Shen ShanghaiTech University; Shanghai United Imaging Intelligence Co., Ltd.; Shanghai Clinical Research and Trial Center

DOI:

https://doi.org/10.1609/aaai.v38i5.28215

Keywords:

CV: Medical and Biological Imaging, CV: Segmentation

Abstract

Accurate segmentation of prostate tumors from multi-modal magnetic resonance (MR) images is crucial for the diagnosis and treatment of prostate cancer. However, the robustness of existing segmentation methods is limited, mainly because these methods 1) fail to adaptively assess subject-specific information of each MR modality for accurate tumor delineation, and 2) lack effective utilization of inter-slice information across thick slices in MR images to segment the tumor as a whole 3D volume. In this work, we propose a two-stage neighbor-aware multi-modal adaptive learning network (NaMa) for accurate prostate tumor segmentation from multi-modal anisotropic MR images. In particular, in the first stage, we perform subject-specific multi-modal fusion in each slice by developing a novel modality-informativeness adaptive learning (MIAL) module that selects and adaptively fuses the informative representation of each modality based on inter-modality correlations. In the second stage, we exploit inter-slice feature correlations to derive volumetric tumor segmentation. Specifically, we first use a Unet variant with sequence layers to coarsely capture slice relationships at a global scale and to generate an activation map for each slice. We then introduce an activation mapping guidance (AMG) module that refines the slice-wise representations, using information from adjacent slices, for consistent tumor segmentation across neighboring slices. In addition, during network training, we apply a random mask strategy to each MR modality to improve the efficiency of feature representation. Experiments on both an in-house and a public (PICAI) multi-modal prostate tumor dataset show that the proposed NaMa outperforms state-of-the-art methods.
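The random mask strategy mentioned in the abstract can be illustrated with a minimal sketch: during training, each MR modality is dropped (zeroed out) with some probability, which encourages the network to build representations that do not over-rely on any single modality. The function name, the `mask_prob` value, and the "keep at least one modality" safeguard below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def random_modality_mask(modalities, mask_prob=0.3, rng=None):
    """Randomly zero out whole MR modalities during training.

    A hypothetical sketch of a random-mask training strategy: each
    modality image (e.g. T2w, DWI, ADC) is dropped with probability
    `mask_prob` (value assumed here), while at least one modality is
    always kept so the input is never entirely empty.
    """
    rng = rng or np.random.default_rng()
    keep = rng.random(len(modalities)) >= mask_prob
    if not keep.any():  # safeguard: never drop every modality
        keep[rng.integers(len(modalities))] = True
    # Masked modalities are replaced by all-zero arrays of the same shape.
    return [m if k else np.zeros_like(m) for m, k in zip(modalities, keep)]
```

Applied per training iteration, this acts as a form of modality-level dropout; at test time all modalities are passed through unmasked.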

Published

2024-03-24

How to Cite

Meng, R., Zhang, X., Huang, S., Gu, Y., Liu, G., Wu, G., Wang, N., Sun, K., & Shen, D. (2024). NaMa: Neighbor-Aware Multi-Modal Adaptive Learning for Prostate Tumor Segmentation on Anisotropic MR Images. Proceedings of the AAAI Conference on Artificial Intelligence, 38(5), 4198-4206. https://doi.org/10.1609/aaai.v38i5.28215

Section

AAAI Technical Track on Computer Vision IV