Communication

Automatic Identification of Ultrasound Images of the Tibial Nerve in Different Ankle Positions Using Deep Learning

Kengo Kawanishi, Akihiro Kakimoto, Keisuke Anegawa, Masahiro Tsutsumi, Isao Yamaguchi and Shintarou Kudo
1 Inclusive Medical Science Research Institute, Morinomiya University of Medical Sciences, Osaka 559-8611, Japan
2 Department of Rehabilitation, Kano General Hospital, Osaka 531-0041, Japan
3 Department of Radiological Sciences, Faculty of Health Sciences, Morinomiya University of Medical Sciences, Osaka 559-8611, Japan
4 Graduate School of Health Science, Morinomiya University of Medical Sciences, Osaka 559-8611, Japan
5 Department of Physical Therapy, Morinomiya University of Medical Sciences, Osaka 559-8611, Japan
6 AR-Ex Medical Research Center, Tokyo 158-0082, Japan
* Author to whom correspondence should be addressed.
Sensors 2023, 23(10), 4855; https://doi.org/10.3390/s23104855
Submission received: 4 May 2023 / Revised: 15 May 2023 / Accepted: 16 May 2023 / Published: 18 May 2023
(This article belongs to the Section Sensing and Imaging)

Abstract:
Peripheral nerve tension is known to be related to the pathophysiology of neuropathy; however, assessing this tension is difficult in a clinical setting. In this study, we aimed to develop a deep learning algorithm for the automatic assessment of tibial nerve tension using B-mode ultrasound imaging. To develop the algorithm, we used 204 ultrasound images of the tibial nerve in three positions: the maximum dorsiflexion position and −10° and −20° plantar flexion from maximum dorsiflexion. The images were taken of 68 healthy volunteers who did not have any abnormalities in the lower limbs at the time of testing. The tibial nerve was manually segmented in all images, and 163 images were used as the training dataset for U-Net-based automatic extraction of the nerve region. Additionally, convolutional neural network (CNN)-based classification was performed to determine each ankle position. The automatic classification was validated using five-fold cross-validation with testing data composed of 41 data points. The highest mean accuracy (0.92) was achieved using manual segmentation. The mean accuracy of the full auto-classification of the tibial nerve at each ankle position was more than 0.77 using five-fold cross-validation. Thus, the tension of the tibial nerve can be accurately assessed at different dorsiflexion angles using an ultrasound imaging analysis with U-Net and a CNN.

1. Introduction

The tibial nerve is a branch of the sciatic nerve that arises at the apex of the popliteal fossa, continuing its course down the leg, posterior to the tibia, and running posteriorly and inferiorly to the medial malleolus through a structure known as the tarsal tunnel [1]. Since it has both motor and sensory functions, compressive neuropathies of the tibial nerve and its branches to the hindfoot can cause heel pain; these can be due to tarsal tunnel syndrome, nerve entrapment, or diabetes-related neuropathy. Some patients with plantar heel pain are diagnosed with plantar fasciitis complicated by tarsal tunnel syndrome [2]. Diabetic peripheral neuropathy, one of the major complications of diabetes mellitus, is difficult to diagnose accurately [3] and can lead to serious complications [4]. Thus, an objective and quantitative assessment method of the condition of the tibial nerve is crucial.
The nerve conduction study (NCS), which is currently the gold standard for tibial nerve assessment, can detect neuropathy by measuring conduction velocity; however, it is time-consuming, invasive, and easily influenced by skin temperature and humidity [5]. Recently, high-resolution ultrasound (US) has been used to diagnose neuromuscular diseases [6], and shear wave elastography (SWE) has been shown to reflect peripheral nerve stiffness [7,8]. However, SWE of peripheral nerves is not sufficiently reliable in a clinical setting [7]. Therefore, there remains a paucity of simple, objective, and reliable procedures for assessing peripheral nerves.
US imaging devices are ideal for screening in the clinical setting because of their portability, low cost, and noninvasive nature. However, it is difficult to accurately diagnose conditions from US images; diagnostic accuracy depends, to a large extent, on the experience and skill of the examiner. In recent years, advances have been made in the automatic assessment of US images using deep learning [9]; however, there have been very few applications for the neuromuscular system.
Recent ultrasonographic studies report that changes in tibial nerve morphology, such as the cross-sectional area [10] and nerve stiffness [8], are crucial factors in peripheral neuropathy of the foot. The tension of the tibial nerve is known to increase with increasing ankle dorsiflexion. Thus, an automatic assessment system for B-mode US images of the tibial nerve at different ankle dorsiflexion positions could provide the basis for an accurate diagnosis of plantar heel pain and diabetic peripheral neuropathy. Moreover, because examination time is limited in clinical settings, a deep learning-based assessment of the tibial nerve would be particularly helpful.
In this context, the aim of this study was to develop an automatic method for assessing B-mode US imaging of the tibial nerve in different ankle positions using deep learning.

2. Materials and Methods

2.1. Participants

We investigated 68 right ankles of 68 healthy adults (33 men and 35 women; mean age: 20.4 ± 0.7 years; height: 165.1 ± 8.0 cm; weight: 57.7 ± 8.9 kg). A history of orthopedic or neurological disease of the lower limbs or trunk was considered a criterion for exclusion. Ethics approval was granted by the University Ethics Committee (authorization no. 2022-046). Informed consent was obtained from all participants prior to testing. This study was conducted in accordance with the principles of the Declaration of Helsinki.

2.2. Ultrasound Image Capturing Method

The tibial nerve was assessed using a B-mode US imaging system (Aplio 300; Canon Medical Systems, Tokyo, Japan) with a 10 MHz linear transducer (PLT-1005BT; Canon Medical Systems, Tokyo, Japan). The imaging area was set based on previous studies [11,12]. The tibial nerve was located using a transverse scan, approximately 1 cm superior to the medial malleolus (Figure 1a), and it was identified using the tibial artery as a landmark (Figure 1b). The nerve was positioned at the center of the screen. The US transducer was rotated by 90° and aligned longitudinally along the tibial nerve plane (Figure 1c). The probe was fixed with a thermoplastic fixture and an elastic bandage. For measurements, the participant was seated on a Biodex 4 isokinetic dynamometer (Biodex Medical Systems, Inc., New York, NY, USA) and the ankle was fixed to the footplate. The participants were seated in the mid-neck position with 90° flexion of the hip and 30° flexion of the knee. The neck and trunk were rested on the headrest and backrest, respectively. The trunk and right thigh were fixed using a belt, and the participant’s ankle was placed in the maximum passive dorsiflexion position. The tibial nerve was assessed using the B-mode US imaging system by a physical therapist with two years of research experience in US imaging and 10 years of experience in a hospital rehabilitation department.
The motion task consisted of repetitive movements in a range of 20° from the maximum dorsiflexion position of the ankle in the direction of plantar flexion. The ankle was moved at a constant velocity of 30°/s; 0.67 s was allowed for each dorsiflexion and plantar flexion and 0.33 s was allowed to switch the direction of movement. After confirming that the participant was relaxed, the dynamics of the tibial nerve were imaged while moving the ankle from plantar flexion to dorsiflexion in three trials (frame rate, 60; depth, 4.5 cm).
US videos of the tibial nerve were converted to still images at three ankle positions: the maximum dorsiflexion position of the ankle and angles of −10° and −20° from maximum dorsiflexion (Figure 2 and Figure 3). A total of 204 data points were collected for analysis. The two-dimensional US images were saved in Portable Network Graphics (PNG) format with an image size of 1536 × 1024 pixels.
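The paper does not describe the video-to-still conversion in code; the following is a minimal sketch of how the three stills could be extracted from a US video with OpenCV, assuming a 60 fps recording, a constant 30°/s ankle velocity (so 10° corresponds to 20 frames), and a separately identified frame index for maximum dorsiflexion. The function and file naming are hypothetical.

```python
import cv2  # OpenCV; used here only for illustration

def extract_position_frames(video_path, md_frame_idx, out_prefix):
    """Save PNG stills at maximum dorsiflexion (md) and at -10 and -20 degrees.

    Assumes a 60 fps recording and a constant 30 deg/s ankle velocity, so 10 deg
    corresponds to 20 frames and 20 deg to 40 frames before md. The frame index
    of maximum dorsiflexion (md_frame_idx) must be identified separately, e.g.
    from the dynamometer angle trace.
    """
    offsets = {"md": 0, "md-10": 20, "md-20": 40}  # frames before maximum dorsiflexion
    cap = cv2.VideoCapture(video_path)
    for label, offset in offsets.items():
        cap.set(cv2.CAP_PROP_POS_FRAMES, max(0, md_frame_idx - offset))
        ok, frame = cap.read()
        if ok:
            cv2.imwrite(f"{out_prefix}_{label}.png", frame)  # 1536 x 1024 PNG still
    cap.release()
```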

2.3. Manual Segmentation

The US image analysis procedure is shown in Figure 4. The tibial nerve region in each US image, classified into one of the three ankle joint angles, was extracted and used as a U-Net training image. All 204 still images (1536 × 1024 pixels) were manually segmented for the tibial nerve using the annotation tool “labelme” by a physical therapist with six years of research experience in US imaging and 15 years of experience in a hospital rehabilitation department.
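For illustration only, the manual annotations could be converted into binary masks for U-Net training with a script like the one below. It assumes the standard labelme JSON layout (a "shapes" list whose entries carry a "label" and polygon "points"); the label name "tibial_nerve" is a hypothetical choice.

```python
import json
import numpy as np
from PIL import Image, ImageDraw  # Pillow, used to rasterize the polygons

def labelme_json_to_mask(json_path, width=1536, height=1024):
    """Convert one labelme annotation file into a binary mask of the tibial nerve.

    Assumes the standard labelme JSON layout: a top-level "shapes" list whose
    entries carry a "label" and a polygon as "points" ([[x, y], ...]).
    """
    with open(json_path) as f:
        ann = json.load(f)
    mask = Image.new("L", (width, height), 0)
    draw = ImageDraw.Draw(mask)
    for shape in ann["shapes"]:
        if shape["label"] == "tibial_nerve":  # hypothetical label name
            pts = [tuple(p) for p in shape["points"]]
            draw.polygon(pts, outline=1, fill=1)
    return np.array(mask, dtype=np.uint8)  # 1 inside the nerve region, 0 elsewhere
```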

2.4. Convolutional Neural Network (CNN)

There are many deep learning methods for classification and identification. In this study, we chose a CNN, the simplest classification method, to assess the tibial nerve. The learning and test processes were run on a graphics processing unit (NVIDIA GeForce RTX 3080 Ti with 12 GB of memory) using TensorFlow 2.4 and Keras 2.4. The input data were downsized to 384 × 256 pixels to prevent memory overflow. The CNN architecture was constructed as a simple three-layer structure, as shown in Figure 5. The model predicted three classes: the maximum dorsiflexion (md), md −10°, and md −20° positions. Each layer consisted of two convolutions (3 × 3 kernels), a rectified linear unit (ReLU), and a 2 × 2 max pooling operation for downsampling.
Three datasets were used as the input images, each containing the 204 images collected from the 68 participants. The first dataset consisted of the raw US images, which contained Digital Imaging and Communications in Medicine (DICOM) tag information strings, such as the acquisition conditions and distances. The second dataset consisted of processed images with these strings deleted from the original images. The third dataset consisted of images obtained by manually extracting only the tibial nerve from the original images. The pixel dimensions were identical across all three datasets.
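A minimal Keras sketch of such a three-layer CNN is shown below. The paper specifies only the convolution/pooling structure, the 384 × 256 input size, and the three output classes; the filter counts, the dense softmax head, the optimizer, and the loss are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(256, 384, 1), n_classes=3, base_filters=32):
    """Three-layer CNN sketch for classifying md, md -10, and md -20 images.

    Each layer: two 3x3 convolutions with ReLU, then 2x2 max pooling.
    Filter counts, the dense softmax head, optimizer, and loss are assumptions.
    """
    inputs = layers.Input(shape=input_shape)  # 384 x 256 input, grayscale assumed
    x = inputs
    for i in range(3):                        # three conv/pool blocks
        f = base_filters * (2 ** i)
        x = layers.Conv2D(f, 3, padding="same", activation="relu")(x)
        x = layers.Conv2D(f, 3, padding="same", activation="relu")(x)
        x = layers.MaxPooling2D(2)(x)
    x = layers.Flatten()(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```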

2.5. U-Net

The U-Net architecture was used to segment the tibial nerve. U-Net is a commonly used technique for the segmentation of biological images. This architecture is composed of encoding and decoding paths, as shown in Figure 6. Each layer in the encoding path consisted of a 4 × 4 convolution, a leaky rectified linear unit (LReLU), and a 2 × 2 max pooling operation for downsampling. Each layer in the decoding path consisted of a 2 × 2 deconvolution, a ReLU, and concatenation with the corresponding cropped feature map from the encoding path for upsampling. The Dice similarity coefficient loss was used at the output layer.
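The following Keras sketch illustrates this encoder-decoder structure and the Dice loss. The network depth, filter counts, and sigmoid output are assumptions not stated in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def dice_loss(y_true, y_pred, smooth=1.0):
    """Dice similarity coefficient loss applied at the U-Net output."""
    y_true = tf.cast(tf.reshape(y_true, [-1]), tf.float32)
    y_pred = tf.reshape(y_pred, [-1])
    intersection = tf.reduce_sum(y_true * y_pred)
    dice = (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + smooth)
    return 1.0 - dice

def build_unet(input_shape=(256, 384, 1), depth=3, base_filters=32):
    """Encoder-decoder sketch of Figure 6: 4x4 conv + LReLU + 2x2 max pooling
    in the encoding path, 2x2 transposed conv + ReLU + skip concatenation in
    the decoding path. Depth, filter counts, and the sigmoid output are
    assumptions."""
    inputs = layers.Input(shape=input_shape)
    x, skips = inputs, []
    for i in range(depth):                                   # encoding path
        x = layers.Conv2D(base_filters * 2 ** i, 4, padding="same")(x)
        x = layers.LeakyReLU(0.2)(x)
        skips.append(x)
        x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(base_filters * 2 ** depth, 4, padding="same")(x)
    x = layers.LeakyReLU(0.2)(x)                             # bottleneck
    for i in reversed(range(depth)):                         # decoding path
        x = layers.Conv2DTranspose(base_filters * 2 ** i, 2, strides=2,
                                   padding="same", activation="relu")(x)
        x = layers.Concatenate()([x, skips[i]])
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)   # tibial nerve mask
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss=dice_loss)
    return model
```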

2.6. Validation

Five-fold cross-validation was used to evaluate the prediction accuracy of the CNN learning model. The training and test groups were randomly divided into five sets, as shown in Table 1. In each set, the learning model was trained on the images of the training group, and the classes of the test group were predicted using that model. The accuracy of each set was evaluated for each of the three types of input datasets: raw images, images with DICOM tag information removed, and manually extracted images of the tibial nerve. Finally, auto-segmentation with U-Net was validated using the intersection over union (IoU) and the cross-sectional area ratio (CSAR); these two metrics evaluate, respectively, the overlap and the area ratio between the predicted and ground-truth segmentations [13].
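For illustration, the IoU and CSAR between a predicted mask and a manual (ground-truth) mask could be computed as follows; the exact CSAR formula is an assumption based on [13].

```python
import numpy as np

def iou(pred_mask, true_mask):
    """Intersection over union between predicted and manual (ground-truth) masks."""
    pred, true = pred_mask.astype(bool), true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    return np.logical_and(pred, true).sum() / union if union else 0.0

def cross_sectional_area_ratio(pred_mask, true_mask):
    """Ratio of the segmented areas (CSAR), taken here as smaller/larger area so
    that a perfect match gives 1.0; this definition is an assumption based on [13]."""
    a_pred, a_true = float(pred_mask.sum()), float(true_mask.sum())
    larger = max(a_pred, a_true)
    return min(a_pred, a_true) / larger if larger else 0.0
```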

3. Results

3.1. Segmentation

The average accuracies of tibial nerve segmentation using U-Net, assessed by the intersection over union and the cross-sectional area ratio, were 0.81 and 0.98, respectively (Table 2).

3.2. Classification

The five-fold cross-validation results showed that the accuracy for the raw data was low (0.44), whereas the accuracies for manual and full auto segmentation were higher, at 0.92 and 0.77, respectively (Table 3). The F-values for the raw data could not be calculated because the values diverged, whereas the F-values for manual and full auto segmentation were 0.92 and 0.76, respectively (Table 3).
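As a small illustration of how the per-fold scores could be obtained, the sketch below uses scikit-learn; the paper does not state how the three-class F-value was averaged, so macro averaging is assumed here.

```python
from sklearn.metrics import accuracy_score, f1_score

def fold_scores(y_true, y_pred):
    """Accuracy and F-value for one cross-validation fold.

    Macro averaging over the three classes is an assumption made for illustration.
    """
    return accuracy_score(y_true, y_pred), f1_score(y_true, y_pred, average="macro")

# Hypothetical usage with labels 0 = md -20, 1 = md -10, 2 = md:
# acc, f_val = fold_scores([0, 1, 2, 2], [0, 1, 1, 2])
```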

4. Discussion

The aim of this study was to develop an automatic assessment of tibial nerve tension using B-mode US imaging with deep learning. We demonstrated high accuracy in the classification of the tibial nerve for each ankle position (more than 77%) by combining automatic extraction of the tibial nerve using U-Net with CNN-based classification.
The applications of deep learning techniques in medical imaging analyses are divided into three types of tasks: “classification”, “detection”, and “segmentation”. Few studies have addressed these tasks in US imaging [9].
Regarding segmentation, several prior studies have investigated cardiac and fetal systems [14,15,16]; however, there are very few deep learning studies on the musculoskeletal system. Belasso et al. [17] showed that US images of the lumbar multifidus muscle could be segmented automatically. Nevertheless, few studies have segmented long-axis images of peripheral nerves.
The classification of images is accomplished by identifying certain anatomical or pathological features that can discriminate one anatomical structure or tissue from others [9]. Studies using this technique have been conducted to diagnose breast tumors [18], liver cancer [19], thyroid nodules [20], and fetal conditions [21].
The results of the present study indicate that automatic segmentation of the tibial nerve and its changes with ankle position can be classified with a high accuracy using deep learning. The tension of the tibial nerve increases with dorsiflexion of the ankle [22,23]; therefore, the US images of the tibial nerve can be classified according to the ankle position, which is equivalent to distinguishing the tension of the tibial nerve.
It is well known that separating a target from the surrounding structures is difficult in low-contrast US images [9]. Peripheral nerves, however, are surrounded by an epineural membrane that separates them from the surrounding tissues and appears as a distinct boundary on US images; therefore, peripheral nerves can be segmented with high accuracy. Moreover, stretching the tibial nerve during dorsiflexion of the ankle alters the appearance of the nerve bundles and perineurium within the nerve, since the nerve has the inherent ability of excursion and stretching with limb motion [24]. Therefore, automatic assessments of the tibial nerve in different ankle positions can achieve high accuracy.
Manual segmentation methods are time-consuming but reliable. It is known that the tension of a peripheral nerve can be quantitatively assessed using SWE. However, US imaging has several advantages over other medical imaging modalities, including portability, accessibility, and cost-effectiveness, whereas SWE is available only on high-end devices. Therefore, an automatic assessment system may be useful in the clinical screening of peripheral nerve neuropathy.
This study has some limitations that should be mentioned. First, accuracy may be lower for images acquired with different US devices. Second, only young, healthy volunteers were included in this study. Therefore, the same investigation should be performed in older adults and in individuals with various neuropathies and other conditions.

5. Conclusions

The automatic classification was validated using five-fold cross-validation with testing data composed of 41 data points. The highest mean accuracy (0.92) was achieved using manual segmentation. The mean accuracy of the full auto-classification of the tibial nerve at each ankle position was more than 0.77 using five-fold cross-validation. The tension of the tibial nerve can be accurately assessed at different dorsiflexion angles using US imaging analyses with U-Net and a CNN.

Author Contributions

Conceptualization, S.K. and K.K.; methodology, S.K., K.K., A.K. and K.A.; software, S.K. and A.K.; validation, S.K., K.K. and A.K.; formal analysis, A.K.; investigation, K.A. and K.K.; resources, S.K. and A.K.; data curation, K.K. and A.K.; writing—original draft preparation, S.K., K.K. and A.K.; writing—review and editing, K.K., A.K., K.A., M.T., I.Y., S.K. and M.T.; visualization, S.K., A.K. and K.K.; supervision, S.K.; project administration, K.K. and S.K.; funding acquisition, S.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Morinomiya University of Medical Sciences (2021-069).

Informed Consent Statement

Written informed consent was obtained from the patients to publish this paper.

Data Availability Statement

The datasets generated and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors thank all participants for their time and cooperation throughout the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Moretti, E.; da Silva, I.B.; Boaviagem, A.; Barbosa, L.; de Lima, A.M.J.; Lemos, A. ‘Posterior Tibial Nerve’ or “Tibial Nerve”? Improving the Reporting in Health Papers. Neurourol. Urodyn. 2020, 39, 847–853.
2. Mulherin, D.; Price, M. Efficacy of Tibial Nerve Block, Local Steroid Injection or Both in the Treatment of Plantar Heel Pain Syndrome. Foot 2009, 19, 98–100.
3. Dyck, P.J.; Overland, C.J.; Low, P.A.; Litchy, W.J.; Davies, J.L.; Dyck, P.J.; O’Brien, P.C.; Cl vs. NPhys Trial Investigators; Albers, J.W.; Andersen, H.; et al. Signs and Symptoms versus Nerve Conduction Studies to Diagnose Diabetic Sensorimotor Polyneuropathy: Cl vs. NPhys Trial. Muscle Nerve 2010, 42, 157–164.
4. Turns, M. The Diabetic Foot: An Overview of Assessment and Complications. Br. J. Nurs. 2011, 20 (Suppl. S8), S19–S25.
5. Çolak, A.; Kutlay, M.; Pekkafali, Z.; Saraçoglu, M.; Demircan, N.; Simşek, H.; Akin, O.N.; Kibici, K. Use of Sonography in Carpal Tunnel Syndrome Surgery. A Prospective Study. Neurol. Med. Chir. 2007, 47, 109–115.
6. Kerasnoudis, A.; Tsivgoulis, G. Nerve Ultrasound in Peripheral Neuropathies: A Review. J. Neuroimaging 2015, 25, 528–538.
7. Aslan, M.; Aslan, A.; Emeksiz, H.C.; Candan, F.; Erdemli, S.; Tombul, T.; Gunaydın, G.D.; Kabaalioğlu, A. Assessment of Peripheral Nerves with Shear Wave Elastography in Type 1 Diabetic Adolescents without Diabetic Peripheral Neuropathy. J. Ultrasound Med. 2019, 38, 1583–1596.
8. He, Y.; Xiang, X.; Zhu, B.H.; Qiu, L. Shear Wave Elastography Evaluation of the Median and Tibial Nerve in Diabetic Peripheral Neuropathy. Quant. Imaging Med. Surg. 2019, 9, 273–282.
9. Liu, S.; Wang, Y.; Yang, X.; Lei, B.; Liu, L.; Li, S.X.; Ni, D.; Wang, T. Deep Learning in Medical Ultrasound Analysis: A Review. Engineering 2019, 5, 261–275.
10. Fantino, O.; Bouysset, M.; Pialat, J.B. Can the axial cross-sectional area of the tibial nerve be used to diagnose tarsal tunnel syndrome? An ultrasonography study. Orthop. Traumatol. Surg. Res. 2021, 107, 102630.
11. Carroll, M.; Yau, J.; Rome, K.; Hing, W. Measurement of Tibial Nerve Excursion during Ankle Joint Dorsiflexion in a Weight-Bearing Position with Ultrasound Imaging. J. Foot Ankle Res. 2012, 5, 5.
12. Kawanishi, K.; Nariyama, Y.; Anegawa, K.; Tsutsumi, M.; Kudo, S. Changes in Tibial Nerve Stiffness during Ankle Dorsiflexion According to In-Vivo Analysis with Shear Wave Elastography. Medicine 2022, 101, e29840.
13. Hashimoto, F.; Kakimoto, A.; Ota, N.; Ito, S.; Nishizawa, S. Automated segmentation of 2D low-dose CT images of the psoas-major muscle using deep convolutional neural networks. Radiol. Phys. Technol. 2019, 12, 210–215.
14. An, S.; Zhou, X.; Zhu, H.; Zhou, F.; Wu, Y.; Yang, T.; Liu, X.; Zhang, Y.; Jiao, Z.; He, Y. Simultaneous Segmentation of Four Cardiac Chambers in Fetal Echocardiography. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Mexico City, Mexico, 1–5 November 2021; pp. 3122–3126.
15. Haak, A.; Vegas-Sánchez-Ferrero, G.; Mulder, H.W.; Ren, B.; Kirişli, H.A.; Metz, C.; van Burken, G.; van Stralen, M.; Pluim, J.P.; van der Steen, A.F.; et al. Segmentation of Multiple Heart Cavities in 3-D Transesophageal Ultrasound Images. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2015, 62, 1179–1189.
16. Lei, Y.; Fu, Y.; Roper, J.; Higgins, K.; Bradley, J.D.; Curran, W.J.; Liu, T.; Yang, X. Echocardiographic Image Multi-Structure Segmentation Using Cardiac-SegNet. Med. Phys. 2021, 48, 2426–2437.
17. Belasso, C.J.; Behboodi, B.; Benali, H.; Boily, M.; Rivaz, H.; Fortin, M. LUMINOUS Database: Lumbar Multifidus Muscle Segmentation from Ultrasound Images. BMC Musculoskelet. Disord. 2020, 21, 703.
18. Jamieson, A.R.; Drukker, K.; Giger, M.L. Breast Image Feature Learning with Adaptive Deconvolutional Networks. SPIE Proc. 2012, 8315, 831506.
19. Guo, L.; Wang, D.; Xu, H.; Qian, Y.; Wang, C.; Zheng, X.; Zhang, Q.; Shi, J. CEUS-Based Classification of Liver Tumors with Deep Canonical Correlation Analysis and Multi-kernel Learning. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju, Republic of Korea, 11–15 July 2017; pp. 1748–1751.
20. Ma, J.; Wu, F.; Zhu, J.; Xu, D.; Kong, D. A Pre-trained Convolutional Neural Network Based Method for Thyroid Nodule Diagnosis. Ultrasonics 2017, 73, 221–230.
21. Wu, L.; Cheng, J.Z.; Li, S.; Lei, B.; Wang, T.; Ni, D. FUIQA: Fetal Ultrasound Image Quality Assessment with Deep Convolutional Networks. IEEE Trans. Cybern. 2017, 47, 1336–1349.
22. Alshami, A.M.; Babri, A.S.; Souvlis, T.; Coppieters, M.W. Strain in the Tibial and Plantar Nerves with Foot and Ankle Movements and the Influence of Adjacent Joint Positions. J. Appl. Biomech. 2008, 24, 368–376.
23. Coppieters, M.W.; Alshami, A.M.; Babri, A.S.; Souvlis, T.; Kippers, V.; Hodges, P.W. Strain and Excursion of the Sciatic, Tibial, and Plantar Nerves during a Modified Straight Leg Raising Test. J. Orthop. Res. 2006, 24, 1883–1889.
24. Sunderland, S.S. The Anatomy and Physiology of Nerve Injury. Muscle Nerve 1990, 13, 771–784.
Figure 1. B-mode ultrasound images of the captured location and the tibial nerve. (a) Probe location. (b) Short-axis image of the tibial nerve. (c) Long-axis image of the tibial nerve.
Figure 2. Three ankle positions: at the maximum dorsiflexion position of the ankle and at angles of −10° and −20° from the maximum dorsiflexion of the ankle.
Figure 3. Morphology of the tibial nerve in three different ankle joint angles. (a) −20° from maximum dorsiflexion. (b) −10° from maximum dorsiflexion. (c) Maximum dorsiflexion.
Figure 4. The US image analysis procedure.
Figure 5. Structure of the CNN. The CNN architecture was constructed as a simple three-layer structure.
Figure 6. Architecture of the U-Net. This architecture is composed of encoding and decoding paths.
Table 1. Classification for five-fold cross-validation (number of images per class).

Class     Set 1 (Training/Test)   Set 2 (Training/Test)   Set 3 (Training/Test)   Set 4 (Training/Test)   Set 5 (Training/Test)
Md −20°   53/15                   51/17                   59/9                    57/11                   52/16
Md −10°   59/9                    51/17                   50/18                   57/11                   55/13
Md        51/17                   61/7                    54/14                   49/19                   57/11
Table 2. Accuracy of tibial nerve segmentation using U-Net.

Set       Intersection over Union   Cross-Sectional Area Ratio
1         0.81                      1.00
2         0.79                      0.99
3         0.80                      0.94
4         0.81                      1.00
5         0.82                      0.98
Average   0.81                      0.98
Table 3. Results of five-fold cross-validation.

          Raw Data               Manual Segmentation     Full Auto Segmentation
Set       Accuracy   F-Value     Accuracy   F-Value      Accuracy   F-Value
1         0.37       -           0.98       0.98         0.83       0.82
2         0.59       -           0.95       0.95         0.80       0.79
3         0.22       -           0.83       0.83         0.66       0.67
4         0.27       -           0.93       0.93         0.80       0.81
5         0.73       -           0.93       0.93         0.75       0.73
Average   0.44       -           0.92       0.92         0.77       0.76
F-values for the raw data could not be calculated as the values diverged.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
