Japanese Journal of
Gastroenterology Research


Review Article - Open Access, Volume 2

The study progress of relationship between upper gastrointestinal endoscopy and artificial intelligence-aided diagnosis in early gastric cancer

Qinchuan Yang1; Bo Shan2; Di Tang3; Juan Yu1; Haikun Zhou1; Chao Yue1; Weidong Wang1; Ruiqi Gao1; Zhenchang Mo1; Panpan Ji1; Ying Zhang4*; Gang Ji1*; Xiaohua Li1*

1Department of Gastrointestinal Surgery, Xijing Hospital, Air Force Military Medical University, Xi’an 710032, Shaanxi, China.

2Department of Ultrasound, Xijing Hospital, Air Force Military Medical University, Xi’an 710032, Shaanxi, China.

3The Second Brigade of Cadet, School of Basic Medicine, Air Force Military Medical University, Xi’an 710032, Shaanxi, China.

4Department of Radiotherapy, Xijing Hospital, Air Force Military Medical University, Xi’an 710032, Shaanxi, China.

*Corresponding Author : Ying Zhang, Gang Ji, Xiaohua Li
Department of Gastrointestinal Surgery, Xijing Hospital, Air Force Military Medical University, Xi’an 710032, Shaanxi, China.
Email: 270346831@qq.com, Jigang@fmmu.edu.cn, xjyylixiaohua@163.com

Received : Oct 26, 2022

Accepted : Nov 11, 2022

Published : Nov 17, 2022

Archived : www.jjgastro.com

Copyright : © Zhang Y, Ji G, Li X (2022).

Abstract

Gastric cancer is the third most deadly cancer in the world. However, the 5-year survival rate of patients first diagnosed with early gastric cancer exceeds 90% after regular treatment. Therefore, it is important to improve the diagnostic performance for gastric cancer through effective early screening, and then take timely interventions to achieve secondary prevention. Upper gastrointestinal endoscopy (UGIE) plays a crucial role in the diagnosis of early gastric cancer. However, UGIE relies mainly on the operator’s technique and diagnostic experience; coupled with the large workload of image analysis, misdiagnosis and missed diagnosis are inevitable even for experienced endoscopists. In recent years, with in-depth research into artificial intelligence, machine learning has been widely studied in the field of gastric cancer owing to its efficient computing power and learning ability. Modern computer vision algorithms applied to the processing of gastroscopic images achieve automatic lesion annotation, feature extraction and transformation, and assisted diagnosis, and are promising aids for endoscopists in the detection and screening of early cancerous lesions. This review summarizes the application of current gastroscopic imaging techniques and algorithms in UGIE, with a view to providing new ideas for the application of artificial intelligence-assisted UGIE in early gastric cancer diagnosis.

Keywords: Artificial intelligence; Machine learning; Deep learning; Convolutional neural networks; Early gastric cancer; Overview.

Citation: Yang Q, Shan B, Zhang Y, Ji G, Li X, et al. The study progress of relationship between upper gastrointestinal endoscopy and artificial intelligence-aided diagnosis in early gastric cancer. Japanese J Gastroenterol Res. 2022; 2(15): 1119.

Introduction

According to Global Cancer Statistics 2018, the incidence of gastric cancer ranks fifth and its fatality rate third among cancers worldwide [1]. The prognosis of gastric cancer patients depends largely on the stage at the time of diagnosis. For those first diagnosed with early gastric cancer, the 5-year survival rate exceeds 90% after regular treatment [2], while patients with advanced gastric cancer have a poorer prognosis. Therefore, the management of gastric cancer focuses on how to improve diagnostic performance through effective early screening tools, and then take timely interventions to achieve secondary prevention and improve patients’ survival.

Esophagogastroduodenoscopy (EGD), also known as Upper Gastrointestinal (UGI) Endoscopy (UGIE), allows direct internal visualization of all parts of the upper gastrointestinal tract, from the proximal esophagus to the descending duodenum. It is the preferred method for diagnosing UGI diseases and plays a crucial role in the early diagnosis of gastric cancer. In daily practice, the quality of UGIE depends mainly on the instrumentation and on the operator’s technique and diagnostic experience [3]. Combined with the large workload of image analysis, even experienced endoscopists inevitably make misdiagnoses and missed diagnoses. As a result, diagnostic results are sometimes heterogeneous, especially in the differential diagnosis of gastric cancer and gastritis, where the missed-diagnosis rate can be as high as 20–40% [4-6].

In recent years, with the in-depth research of Artificial Intelligence (AI) theory, machine learning has been widely applied in the field of gastric cancer with its efficient computing power and learning ability. In particular, the deep learning mode, which applies modern computer vision algorithms to the processing of gastroscopic images, has achieved automatic lesion labelling, feature extraction and transformation, and assisted diagnosis. This computer-aided diagnosis (CAD) technique is conducive to helping endoscopists in the detection and screening of early gastric cancer.

This review summarizes the application and progress of AI-assisted UGIE in the diagnosis of early gastric cancer and its significance in guiding clinical diagnosis and treatment, aiming to provide new ideas for the application of AI-assisted UGIE in the diagnosis of early gastric cancer and a reference for the early prevention, preliminary screening and clinical treatment of gastric cancer.

Artificial intelligence

AI refers to machines that can perform complex tasks like humans by intelligently mimicking cognitive functions such as learning and problem solving. The term was first introduced in 1955, and AI has since been rapidly applied in the field of medicine [7].

Machine Learning (ML) is an area of artificial intelligence. It integrates large amounts of data and algorithms into machines, which then form analytical models by automatically learning from the input data. Machine learning algorithms mainly include decision trees, random forests, logistic regression, support vector machines, naïve Bayes, the k-nearest neighbors algorithm, k-means clustering, the AdaBoost algorithm, neural networks, Markov models, etc [8].
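As a concrete illustration of one of these classical algorithms, a k-nearest-neighbors classifier can be sketched in a few lines of pure Python. The feature vectors and labels below are hypothetical toy data for illustration only, not real endoscopic features:

```python
from collections import Counter
import math

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    nearest = sorted(
        range(len(train)),
        key=lambda i: math.dist(train[i], query),  # Euclidean distance
    )
    votes = Counter(labels[i] for i in nearest[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors with illustrative binary labels
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = ["benign", "benign", "malignant", "malignant"]

print(knn_predict(train, labels, (0.15, 0.15)))  # → benign
print(knn_predict(train, labels, (0.85, 0.85)))  # → malignant
```

In practice such classifiers operate on feature vectors extracted from endoscopic images rather than raw coordinates, and the choice of k and distance metric is tuned on validation data.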

Deep Learning (DL), sometimes described as a “modernized neural network” approach, is a special kind of machine learning model that imitates the way neurons pass and process information. Deep learning techniques can be broadly classified into three major categories: (i) deep networks for supervised or discriminative learning, such as convolutional neural networks and recurrent neural networks; (ii) deep networks for unsupervised or generative learning, such as autoencoders and restricted Boltzmann machines; and (iii) deep networks for hybrid learning and related approaches [9].

Among the AI algorithms applied to assist endoscopic diagnosis of early gastric cancer, Convolutional Neural Networks (CNNs) have gained increasing attention for their excellent image recognition ability. The main CNN models include LeNet, AlexNet, GoogLeNet, VGG, ResNet, etc. The improved versions of CNN differ somewhat in depth or organization, but the structural components that make up the models are the same, including convolutional operations, pooling operations, fully connected operations, and recognition (classification) operations [10]. The relationship between AI, machine learning and deep learning is shown in Figure 1.
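The convolutional and pooling operations shared by these CNN variants can be illustrated with a minimal pure-Python sketch: a single-channel “valid” convolution followed by 2×2 max pooling. Real networks stack many such layers with kernels learned from data; the 5×5 image and edge-detecting kernel below are illustrative assumptions:

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (really cross-correlation, as in most DL frameworks)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [sum(image[i + a][j + b] * kernel[a][b]
             for a in range(kh) for b in range(kw))
         for j in range(out_w)]
        for i in range(out_h)
    ]

def max_pool2x2(fmap):
    """2x2 max pooling with stride 2, halving each spatial dimension."""
    return [
        [max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
         for j in range(0, len(fmap[0]) - 1, 2)]
        for i in range(0, len(fmap) - 1, 2)
    ]

# A 5x5 "image" with a bright vertical edge, and a 3x3 vertical-edge kernel
image = [[0, 0, 1, 1, 1]] * 5
kernel = [[-1, 0, 1]] * 3

fmap = conv2d(image, kernel)    # 3x3 feature map highlighting the edge
pooled = max_pool2x2(fmap)      # pooling reduces it to 1x1
print(fmap, pooled)
```

The convolution responds strongly where the kernel’s pattern (here, a left-to-right intensity increase) matches the image, and pooling keeps the strongest local response, which is the intuition behind CNN feature extraction.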

Recently, AI has been playing an increasing role in clinical diagnosis. Its excellent adjunctive diagnostic value has been validated, particularly in colonoscopy [11]. Similarly, more and more attention has been paid to its diagnostic efficacy in UGIE [12-14].

Algorithmic models and various UGIE imaging techniques

In recent years, researchers have used AI to build multiple algorithmic models for application in various UGIE imaging techniques, which expand the means of diagnosing and evaluating early gastric cancer [15-19].

Video-based AI systems

Previously, regions of interest in static screenshots were selected manually to distinguish gastric cancer from non-cancerous lesions, which improved diagnostic accuracy [15]. In the latest studies, suspicious lesion areas are captured automatically in dynamic real-time video.

M. Ishioka et al. used an AI system for recognizing still images to analyze video footage, achieving 94.1% accuracy [20]. Y. Horiuchi et al. used a video-based AI system to achieve 85.1% accuracy in distinguishing early gastric cancer from non-cancerous lesions [21]. The successful application of video-based CNN-CAD systems demonstrates the potential of AI-assisted diagnosis on real-time video, which may become a reliable technique for clinicians screening for early gastric cancer.

Current video processing models are still retrospective analyses of recorded video. Although some groups are conducting randomized clinical trials, no real-time video evaluation system for use during UGIE has yet been proven. Therefore, the application of AI-assisted diagnostic models to clinical examinations is worth exploring.

Optical endoscopy

Optical endoscopy was the first endoscopic imaging technique to be developed. Building on this technology, various innovative imaging techniques, such as endocytoscopy and image-enhanced endoscopy, have been developed to improve the ability to identify early gastric cancer [22,23].

Magnifying Narrow-Band Imaging (M-NBI), which analyzes microstructure and microvasculature, is the most reliable tool available for evaluating early gastric cancer [24]. H. Ali et al. performed texture analysis of methylene blue-stained images from chromoendoscopy and used support vector machines to construct models that screen possible abnormal images from large numbers of endoscopic images [16]. T. Hirasawa et al. were the first to apply a CNN to construct a model for detecting gastric cancer in endoscopic images [17]. Their model, based on standard white light images, chromoendoscopy with indigo carmine spray, and Narrow-Band Imaging (NBI), could analyze 2296 static images in 47 seconds and correctly diagnosed 71 out of 77 gastric cancer lesions. H. Ueyama et al. constructed a CAD-CNN model based on ME-NBI static images that diagnosed early gastric cancer with 98.7% accuracy [25]. H. Noda et al. used the ResNet50 model to analyze endocytoscopy images to distinguish early gastric cancer, achieving an accuracy of 86.1% and higher specificity than all endoscopists [26].

Ultrasound endoscopy

Because Endoscopic Ultrasound (EUS) shows the structure of the gastric wall, it allows preoperative staging and diagnosis of gastric cancer based on the depth of lesion infiltration [27], and its accuracy has been demonstrated [28,29].

Y.H. Kim et al. developed a CNN-CAD system based on EUS to detect gastric mesenchymal tumors, with accuracy, sensitivity and specificity of 79.2%, 83.0% and 75.5%, respectively [30]. H. Tanaka et al. constructed a residual neural network model based on Contrast-Enhanced Harmonic Endoscopic Ultrasonography (CH-EUS) to distinguish Gastrointestinal Stromal Tumors (GISTs) from smooth muscle tumors, with an accuracy of 90.6%, comparable to that of endoscopists [31].
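The accuracy, sensitivity, and specificity values reported throughout these studies follow the standard confusion-matrix definitions, which can be sketched as follows. The counts below are hypothetical and are not taken from the cited studies:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic performance measures from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),                # true-positive rate
        "specificity": tn / (tn + fp),                # true-negative rate
        "accuracy": (tp + tn) / (tp + fp + tn + fn),  # overall agreement
    }

# Hypothetical example: 83 of 100 tumors detected,
# 151 of 200 non-tumor cases correctly excluded
m = diagnostic_metrics(tp=83, fp=49, tn=151, fn=17)
print(f"sensitivity={m['sensitivity']:.1%}, "
      f"specificity={m['specificity']:.1%}, "
      f"accuracy={m['accuracy']:.1%}")
```

Note that accuracy depends on the prevalence of disease in the test set, which is one reason comparisons across studies with different case mixes should be made cautiously.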

However, several retrospective studies have concluded that EUS is highly operator-dependent and susceptible to both subjective and objective factors [32,33]. Therefore, the results of clinical studies remain heterogeneous and need further confirmation.

In conclusion, artificial intelligence-aided diagnostic techniques combine conventional optical endoscopy with image-enhanced endoscopy, including narrow-band imaging, magnification endoscopy, chromoendoscopy, endocytoscopy, and endoscopic ultrasonography.

Through deep learning of lesion image features, early gastric cancer lesions in static images can be identified more accurately. AI-assisted recognition of endoscopic images has also been initially validated during endoscopy itself, especially for real-time labeling of lesion areas, quality control, and evaluation analysis.

Application of AI combined with UGIE in examination

Researchers are currently focusing on AI-assisted diagnosis of gastric cancer in terms of blind area identification, benign and malignant judgment, and infiltration depth detection.

Blind area identification

Compared with other parts of the gastrointestinal tract, the interior of the stomach is more spacious and curved, so manipulation is more demanding. Identifying blind areas therefore requires more delicate manipulation, which not all endoscopists can achieve [34]. To locate blind spots that may be missed by the endoscopist during Esophagogastroduodenoscopy (EGD) examination, L. Wu et al. developed WISENSE, a real-time CNN system for blind spot detection [18,35]. This showed that AI can also improve the quality of EGD by identifying blind spots. The blind spot rate for AI-assisted sedated Conventional EGD (C-EGD) was significantly lower than that for Unsedated Ultrathin Transoral Endoscopy (U-TOE) [36].

During UGIE, AI-assisted diagnosis improves diagnostic efficacy by judging the field of view and capturing suspicious sites in real time, reducing the heterogeneity caused by differences in endoscopists’ experience. Thus, junior endoscopists benefit from the real-time feedback of the AI model, and their diagnostic quality improves significantly.

Determination of benign and malignant lesions

Early gastric cancer usually appears as a slight bulge or depression, and its appearance is easily concealed by gastritis with H. pylori infection. Even for experienced specialists, it is sometimes difficult to detect early gastric cancer from endoscopic images alone, especially for lesions of small diameter. This leads to missed diagnoses or misdiagnoses of gastric cancer and increases the heterogeneity of detection rates among endoscopists [37].

AI has shown outstanding recognition capability in endoscopic imaging by actively learning through neural networks, extracting pathological features and fitting them [13]. H. Hu et al. analyzed Magnifying Endoscopy with Narrow-Band Imaging (ME-NBI) images based on the VGG-19 architecture and developed the EGCM model to identify early gastric cancer [38]. The model outperformed endoscopists in distinguishing early gastric cancer from erosive gastritis. A Cascade R-CNN deep learning model has been used to detect lesions in white light still images, while a DenseNet-121 model was used to evaluate image quality [39].

In large-scale screening, the advantages of rapid image processing by artificial intelligence can be used to help endoscopists efficiently screen valuable endoscopic images and identify suspicious lesions.

Infiltration depth detection

The depth of early gastric cancer infiltration is usually estimated using ultrasound endoscopy or conventional endoscopy. EUS judges depth relatively directly from ultrasound images of the entire gastric wall. In contrast, conventional optical endoscopy relies primarily on mucosal surface features for indirect judgment, including hypertrophy or fusion of converging folds, tumor size ≥30 mm, marked redness, surface irregularity, rim elevation, submucosal tumor-like rim elevation, and the non-extension sign [40]. Some studies have reported that conventional endoscopy is as accurate as EUS in predicting the depth of infiltration in early gastric cancer [41]; the accuracy of either modality in estimating infiltration depth is 60% to 85% [41-45]. This suggests that new methods are still needed to further improve the diagnostic efficacy of endoscopic prediction of infiltration depth.

On the one hand, new endoscopic diagnostic features have been discovered, such as the non-extension sign [46]; on the other hand, diagnosis based on AI-assisted endoscopic images may become possible in the future. K. Kubota et al. retrospectively collected 902 conventional endoscopic images of gastric cancer and, by analyzing gray-scale static images from optical gastroscopy with a back-propagation algorithm, obtained a T-stage prediction model with an overall accuracy of 64.7% [47]. This was the first study to use AI-assisted gastroscopy to predict the depth of gastric cancer infiltration. Although the accuracy is not satisfactory, as an original study it suggests the feasibility of the approach. Moreover, compared with gray-scale images, images from color endoscopy and image-enhanced endoscopy carry richer information for diagnostic analysis, which has positive implications for improving diagnostic efficacy.

K. Hamada et al. applied a ResNet152 CNN to distinguish intramucosal from submucosal carcinoma, with sensitivity, specificity and accuracy of 84.9%, 70.7% and 78.9%, respectively [19]. The model mainly combined static image information from conventional endoscopy and image-enhanced endoscopy, including white light imaging, linked color imaging, blue laser imaging-bright, and indigo carmine dye contrast imaging. Y. Zhu et al. applied the ResNet50 architecture to build a CNN-CAD system that predicts the depth of infiltration and identifies submucosally invasive early gastric cancer, with an overall accuracy of 89.16% [48].

In summary, it is feasible to apply CNN algorithms to assess the infiltration depth of early gastric cancer from endoscopic images.

Current deficiencies and future prospects

As with other clinical studies, the quality and quantity of data are essential. Poor image quality, including active bleeding, blurring, scatter, mucus, reflections, and foam, inevitably degrades the diagnostic efficacy of the algorithm. Collecting high-quality clinical data is as important as developing models that generalize well: patterns learned from training data are used to predict outputs for new inputs, so validity and accuracy depend largely on the quality and quantity of the training data. However, most current research uses single-center data, which carries a risk of overfitting. Therefore, further external validation is the focus of future work.

In addition, further research on the usefulness, cost-effectiveness, possible risks, and regulatory measures of AI is needed for its effective use in clinical practice [49]. Moreover, further development will require significant resources, including long-term accumulation of time and cases, and the involvement of medical practitioners and engineers.

Furthermore, most current studies include only common benign lesions such as gastritis and only a few common subtypes of gastric cancer, so there is still a gap between current studies and the real world. If AI is to be applied in clinical examinations for real-time assisted gastroscopic diagnosis, further refinements are needed, including coverage of all lesions seen under UGIE and analysis of atypical lesions as well as rare diseases [38].

Endoscopy itself, including Image-Enhanced Endoscopy (IEE), has limitations. Although Japanese guidelines recommend the use of IEE, it can evaluate only the gastric mucosal surface. Some early gastric cancers, such as gastric adenocarcinoma of the fundic gland type, early gastric cancer after H. pylori eradication, and some undifferentiated early gastric cancers, are difficult to diagnose as cancerous lesions [50]. In such lesions, the tumor arises in non-tumorous epithelium or the subepithelium. Therefore, AI systems may fail to recognize them as neoplastic and may misinterpret cancerous lesions as non-cancerous.

Although AI-assisted endoscopy can narrow the diagnostic gap caused by differences in clinical experience among endoscopists, the role of the endoscopist in clinical work remains irreplaceable. Even with AI models capable of automatically capturing suspicious areas, endoscopists still need to zoom in on those areas and capture static images during the examination before real-time analysis can be performed [38]. Therefore, the training of endoscopists and the standardization of endoscopy practice remain key to improving the effectiveness of computer-aided diagnosis.

Some algorithmic models exhibit high diagnostic accuracy. However, heterogeneity exists between studies, so comparisons cannot be made on accuracy values alone. Furthermore, there is no conclusive evidence on which pairing of imaging technique and algorithm best identifies early gastric cancer [13], and further research is still needed.

Discussion

The prognosis of gastric cancer patients depends more on the stage at the time of diagnosis. 

Therefore, it is important to improve the diagnostic performance for gastric cancer through effective early screening, and then take timely interventions to achieve secondary prevention. UGIE allows direct internal visualization of all parts of the upper gastrointestinal tract, from the proximal esophagus to the descending duodenum. It is the preferred method for the early diagnosis of gastric cancer.

UGIE relies mainly on the operator’s technique and diagnostic experience. Coupled with the large workload of image analysis, misdiagnosis and missed diagnosis are inevitable even for experienced endoscopists. In recent years, with in-depth research into artificial intelligence, machine learning has been widely studied in the field of gastric cancer owing to its efficient computing power and learning ability. Modern computer vision algorithms applied to gastroscopic images achieve automatic lesion annotation, feature extraction and transformation, and assisted diagnosis, and are promising aids for endoscopists in detecting and screening early cancerous lesions.

Currently, research hotspots in AI-assisted endoscopy focus mainly on the identification of early gastric cancer. In future studies, endoscopic lesion characteristics could be used to predict lymph node metastasis, possible postoperative complications, response to drug therapy, and prognosis. All these directions are possible with AI and await further exploration.

Declarations

Conflict of interest: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Author contributions

Conceptualization, ZY, JG, LX-H; investigation, YQ-C, SB; resources, YQ-C, SB; data curation, TD, YJ, WW-D, MZ-C, JP-P, YC, ZH-K, GR-Q; writing—original draft preparation, YQ-C, SB; writing—review and editing, YQ-C, SB; visualization, YQ-C, SB; supervision, ZY, JG, LX-H; project administration, ZY, JG, LX-H. All authors have read and agreed to the published version of the manuscript.

Funding: This research received no external funding.

Acknowledgments: The authors thank all colleagues who contributed to this study.

Informed consent statement: Not applicable.

Data availability statement: Not applicable.

References

  1. Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, et al. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2018; 68: 394-424.
  2. Ajani JA, Bentrem DJ, Besh S, D’Amico TA, Das P, et al. Gastric cancer, version 2. 2013: featured updates to the NCCN Guidelines. J Natl Compr Canc Netw. 2013; 11: 531-546.
  3. Săftoiu A, Hassan C, Areia M, Bhutani MS, Bisschops R, et al. Role of gastrointestinal endoscopy in the screening of digestive tract cancers in Europe: European Society of Gastrointestinal Endoscopy (ESGE) Position Statement. Endoscopy. 2020; 52: 293-304.
  4. Menon S, Trudgill N. How commonly is upper gastrointestinal cancer missed at endoscopy? A meta-analysis. Endoscopy international open. 2014; 2: E46-50.
  5. Pimenta-Melo AR, Monteiro-Soares M, Libânio D, Dinis-Ribeiro M. Missing rate for gastric cancer during upper gastrointestinal endoscopy: a systematic review and meta-analysis. European journal of gastroenterology & hepatology. 2016; 28: 1041-1049.
  6. Pasechnikov V, Chukov S, Fedorov E, Kikuste I, Leja M. Gastric cancer: prevention, screening and early diagnosis. World J Gastroenterol. 2014; 20: 13842-13862.
  7. Elkhader J, Elemento O. Artificial intelligence in oncology: From bench to clinic. Semin Cancer Biol. 2022; 84: 113-128.
  8. Subasi A. Chapter 3 - Machine learning techniques. in: A. Subasi, (Ed.), Practical Machine Learning for Data Analysis Using Python, Academic Press, 2020; 91-202.
  9. Sarker IH. Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions. SN Computer Science. 2021; 2: 420.
  10. Skansi S. Convolutional Neural Networks. in: S. Skansi, (Ed.), Introduction to Deep Learning: From Logical Calculus to Artificial Intelligence, Springer International Publishing, Cham. 2018; 121-133.
  11. Wang P, Berzin TM, Glissen Brown JR, Bharadwaj S, Becq A, et al. Real-time automatic detection system increases colonoscopic polyp and adenoma detection rates: a prospective randomised controlled study. Gut. 2019; 68: 1813-1819.
  12. Ahmad OF. Early detection of gastric neoplasia: is artificial intelligence the solution? Lancet Gastroenterol Hepatol. 2021; 6: 678-679.
  13. Chen PC, Lu YR, Kang YN, Chang CC. The Accuracy of Artificial Intelligence in the Endoscopic Diagnosis of Early Gastric Cancer: Pooled Analysis Study. J Med Internet Res. 2022; 24: e27694.
  14. Hirasawa T, Ikenoyama K, Ishioka M, Namikawa K, Horiuchi K, et al. Current status and future perspective of artificial intelligence applications in endoscopic diagnosis and management of gastric cancer. Dig Endosc. 2021; 33: 263-272.
  15. Miyaki R, Yoshida S, Tanaka S, Kominami Y, Sanomura Y, et al. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer. J Clin Gastroenterol. 2015; 49: 108-115.
  16. Ali H, Yasmin M, Sharif M, Rehmani MH. Computer assisted gastric abnormalities detection using hybrid texture descriptors for chromoendoscopy images. Comput Methods Programs Biomed. 2018; 157: 39-47.
  17. Hirasawa T, Aoyama K, Tanimoto T, Ishihara S, Shichijo S, et al. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer. 2018; 21: 653-660.
  18. Wu L, Zhou W, Wan X, Zhang J, Shen L, et al. A deep neural network improves endoscopic detection of early gastric cancer without blind spots. Endoscopy. 2019; 51: 522-531.
  19. Hamada K, Kawahara Y, Tanimoto T, Ohto A, Toda A, et al. Application of convolutional neural networks for evaluating the depth of invasion of early gastric cancer based on endoscopic images. J Gastroenterol Hepatol. 2022; 37: 352-357.
  20. Ishioka M, Hirasawa T, Tada T. Detecting gastric cancer from video images using convolutional neural networks. Dig Endosc. 2019; 31: e34-e35.
  21. Horiuchi Y, Hirasawa T, Ishizuka N, Tokai Y, Namikawa K, et al. Performance of a computer-aided diagnosis system in diagnosing early gastric cancer using magnifying endoscopy videos with narrow-band imaging (with videos). Gastrointest Endosc. 2020; 92: 856-865.e1.
  22. East JE, Vleugels JL, Roelandt P, Bhandari P, Bisschops R, et al. Advanced endoscopic imaging: European Society of Gastrointestinal Endoscopy (ESGE) Technology Review. Endoscopy. 2016; 48: 1029-1045.
  23. Neumann H, Fuchs FS, Vieth M, Atreya R, Siebler J, et al. Review article: in vivo imaging by endocytoscopy. Aliment Pharmacol Ther. 2011; 33: 1183-1193.
  24. Zhang Q, Wang F, Chen ZY, Wang Z, Zhi FC, et al. Comparison of the diagnostic efficacy of white light endoscopy and magnifying endoscopy with narrow band imaging for early gastric cancer: a meta-analysis. Gastric Cancer. 2016; 19: 543-552.
  25. Ueyama H, Kato Y, Akazawa Y, Yatagai N, Komori H, et al. Application of artificial intelligence using a convolutional neural network for diagnosis of early gastric cancer based on magnifying endoscopy with narrow-band imaging. J Gastroenterol Hepatol. 2021; 36: 482-489.
  26. Noda H, Kaise M, Higuchi K, Koizumi E, Yoshikata K, et al. Convolutional neural network-based system for endocytoscopic diagnosis of early gastric cancer. BMC Gastroenterol. 2022; 22: 237.
  27. Cardoso R, Coburn N, Seevaratnam R, Sutradhar R, Lourenco LG, et al. A systematic review and meta-analysis of the utility of EUS for preoperative staging for gastric cancer. Gastric Cancer. 2012; 15: S19-26.
  28. Mocellin S, Pasquali S. Diagnostic accuracy of endoscopic ultrasonography (EUS) for the preoperative locoregional staging of primary gastric cancer. Cochrane Database Syst Rev. 2015; Cd009944.
  29. Nie RC, Yuan SQ, Chen XJ, Chen S, Xu LP, et al. Endoscopic ultrasonography compared with multidetector computed tomography for the preoperative staging of gastric cancer: a meta-analysis. World J Surg Oncol. 2017; 15: 113.
  30. Kim YH, Kim GH, Kim KB, Lee MW, Lee BE, et al. Application of A Convolutional Neural Network in The Diagnosis of Gastric Mesenchymal Tumors on Endoscopic Ultrasonography Images. J Clin Med. 2020; 9.
  31. Tanaka H, Kamata K, Ishihara R, Handa H, Otsuka Y, et al. Value of artificial intelligence with novel tumor tracking technology in the diagnosis of gastric submucosal tumors by contrast-enhanced harmonic endoscopic ultrasonography. J Gastroenterol Hepatol. 2022; 37: 841-846.
  32. Shi D, Xi XX. Factors Affecting the Accuracy of Endoscopic Ultrasonography in the Diagnosis of Early Gastric Cancer Invasion Depth: A Meta-analysis. Gastroenterol Res Pract. 2019; 2019: 8241381.
  33. Tsujii Y, Kato M, Inoue Y, Yoshii S, Nagai K, et al. Integrated diagnostic strategy for the invasion depth of early gastric cancer by conventional endoscopy and EUS. Gastrointest Endosc. 2015; 82: 452-459.
  34. O’Mahony S, Naylor G, Axon A. Quality assurance in gastrointestinal endoscopy. Endoscopy. 2000; 32: 483-488.
  35. Wu L, Zhang J, Zhou W, An P, Shen L, et al. Randomised controlled trial of WISENSE, a real-time quality improving system for monitoring blind spots during esophagogastroduodenoscopy. Gut. 2019; 68: 2161-2169.
  36. Chen D, Wu L, Li Y, Zhang J, Liu J, et al. Comparing blind spots of unsedated ultrafine, sedated, and unsedated conventional gastroscopy with and without artificial intelligence: a prospective, single-blind, 3-parallel-group, randomized, single-center trial. Gastrointest Endosc. 2020; 91: 332-339.e3.
  37. Lee HL, Eun CS, Lee OY, Han DS, Yoon BC, et al. When do we miss synchronous gastric neoplasms with endoscopy? Gastrointest Endosc. 2010; 71: 1159-1165.
  38. Hu H, Gong L, Dong D, Zhu L, Wang M, et al. Identifying early gastric cancer under magnifying narrow-band images with deep learning: a multicenter study. Gastrointestinal endoscopy. 2021; 93: 1333-1341.e3.
  39. Oura H, Matsumura T, Fujie M, Ishikawa T, Nagashima A, et al. Development and evaluation of a double-check support system using artificial intelligence in endoscopic screening for gastric cancer. Gastric Cancer. 2022; 25: 392-400.
  40. Yao K, Uedo N, Kamada T, Hirasawa T, Nagahama T, et al. Guidelines for endoscopic diagnosis of early gastric cancer. Dig Endosc. 2020; 32: 663-698.
  41. Choi J, Kim SG, Im JP, Kim JS, Jung HC, et al. Comparison of endoscopic ultrasonography and conventional endoscopy for prediction of depth of tumor invasion in early gastric cancer. Endoscopy. 2010; 42: 705-713.
  42. Abe S, Oda I, Shimazu T, Kinjo T, Tada K, et al. Depth-predicting score for differentiated early gastric cancer. Gastric Cancer. 2011; 14: 35-40.
  43. Sano T, Okuyama Y, Kobori O, Shimizu T, Morioka Y. Early gastric cancer. Endoscopic diagnosis of depth of invasion. Dig Dis Sci. 1990; 35: 1340-1344.
  44. Shimoyama S, Yasuda H, Hashimoto M, Tatsutomi Y, Aoki F, et al. Accuracy of linear-array EUS for preoperative staging of gastric cardia cancer. Gastrointest Endosc. 2004; 60: 50-55.
  45. Wakelin SJ, Deans C, Crofts TJ, Allan PL, Plevris JN, et al. A comparison of computerised tomography, laparoscopic ultrasound and endoscopic ultrasound in the preoperative staging of oesophago-gastric carcinoma. Eur J Radiol. 2002; 41: 161-167.
  46. Nagahama T, Yao K, Imamura K, Kojima T, Ohtsu K, et al. Diagnostic performance of conventional endoscopy in the identification of submucosal invasion by early gastric cancer: the “non-extension sign” as a simple diagnostic marker. Gastric Cancer. 2017; 20: 304-313.
  47. Kubota K, Kuroda J, Yoshida M, Ohta K, Kitajima M. Medical image analysis: computer-aided diagnosis of gastric cancer invasion on endoscopic images. Surg Endosc. 2012; 26: 1485-1489.
  48. Zhu Y, Wang QC, Xu MD, Zhang Z, Cheng J, et al. Application of convolutional neural network in the diagnosis of the invasion depth of gastric cancer based on conventional endoscopy. Gastrointest Endosc. 2019; 89: 806-815 e1.
  49. Kim JH, Nam SJ, Park SC. Usefulness of artificial intelligence in gastric neoplasms. World journal of gastroenterology. 2021; 27: 3543-3555.
  50. Matsumoto K, Ueyama H, Yao T, Abe D, Oki S, et al. Diagnostic limitations of magnifying endoscopy with narrow-band imaging in early gastric cancer. Endosc Int Open. 2020; 8: E1233-E1242.