Automatic Assessment of OCT Skin Thickness in Mice Based on Transfer Learning and Attention Mechanisms
DOI:
https://doi.org/10.5566/ias.3055

Keywords:
Deep learning, Image segmentation, Optical coherence tomography, U-Net

Abstract
Optical coherence tomography (OCT) is characterized by high resolution and noninvasiveness and has therefore been widely used to analyze skin tissue in recent years. Previous studies evaluated skin OCT images using traditional algorithms, which achieve low accuracy on images of complex tissue structures. Although a few studies have applied deep learning methods to assess tissue structure in OCT images, they are limited to the epidermal layer and lack quantitative assessment of deeper skin tissue thickness. Thus, in the present study, we proposed an automated segmentation and quantitative evaluation method. The skin OCT images were first preprocessed, and an attention mechanism was added to a U-Net based on transfer learning to segment the images and quantify the thickness of the mouse skin tissue structure. The results showed that U-Net combined with the coordinate attention (CA) mechanism had better segmentation performance, with a mean intersection over union (MIoU) of 93.94% and a Dice similarity coefficient of 96.99%; the segmentation errors were 0.6 μm, 2.2 μm, 3.8 μm, and 6.0 μm for the epidermis layer, subcutaneous fat layer, muscle fascia layer, and the overall skin tissue structure of the mice, respectively. The overall skin tissue thicknesses of the four mice were 235 ± 20 μm, 264 ± 42 μm, 275 ± 40 μm, and 774 ± 91 μm, respectively. The present study provides a rapid and accurate method for the automated measurement of skin tissue thickness.
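The two segmentation metrics reported in the abstract, the Dice similarity coefficient and the mean intersection over union (MIoU), are standard overlap measures between a predicted mask and a ground-truth mask. The paper does not give its evaluation code; the following is a minimal NumPy sketch of how these metrics are conventionally computed, with function names chosen here for illustration:

```python
import numpy as np

def dice_coefficient(pred, target):
    # Dice similarity coefficient for binary masks: 2|A ∩ B| / (|A| + |B|)
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    intersection = np.logical_and(pred, target).sum()
    total = pred.sum() + target.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

def mean_iou(pred, target, num_classes):
    # Mean intersection over union, averaged over the classes
    # (e.g., background, epidermis, subcutaneous fat, muscle fascia).
    pred = np.asarray(pred)
    target = np.asarray(target)
    ious = []
    for c in range(num_classes):
        p = pred == c
        t = target == c
        union = np.logical_or(p, t).sum()
        if union == 0:
            continue  # class absent in both masks; skip it
        ious.append(np.logical_and(p, t).sum() / union)
    return float(np.mean(ious))
```

Per-layer thickness would then be derived from the segmented masks, e.g., by counting axial pixels per layer in each A-scan and converting to micrometres via the OCT axial resolution.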
Data Availability Statement
The data that support the findings of this study are available from the corresponding author upon reasonable request.
License
Copyright (c) 2024 Changke Wang, Qiong Ma, Yu Wei, Nan Yu, YuQing Wang, Qingyu Cai, Haiyang Sun, Hongxiang Kang
This work is licensed under a Creative Commons Attribution 4.0 International License.