CMT-CNN: colposcopic multimodal temporal hybrid deep learning model to detect cervical intraepithelial neoplasia

(1) * Lalasa Mukku (CHRIST (Deemed to be University), India)
(2) Jyothi Thomas (CHRIST (Deemed to be University), India)
* corresponding author

Abstract


Cervical cancer poses a significant threat to women's health in developing countries, necessitating effective early detection methods. In this study, we introduce the Colposcopic Multimodal Temporal Convolution Neural Network (CMT-CNN), a novel model designed to classify cervical intraepithelial neoplasia by leveraging sequential colposcopy images and integrating the extracted features with clinical data. Our approach incorporates Mask R-CNN for precise cervix region segmentation and deploys the EfficientNet-B7 architecture to extract features from saline, iodine, and acetic acid images. The fusion of clinical data at the decision level, coupled with Atrous Spatial Pyramid Pooling-based classification, yields remarkable results: an accuracy of 92.31%, precision of 90.19%, recall of 89.63%, and an F1-score of 90.72%. This achievement not only establishes the superiority of the CMT-CNN model over baselines but also paves the way for future research endeavours aiming to harness heterogeneous data types in the development of deep learning models for cervical cancer screening. The implications of this work are profound, offering a potent tool for early cervical cancer detection that combines multimodal data and clinical insights, potentially saving countless lives.
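The decision-level fusion step described in the abstract can be sketched as a weighted average of the class-probability vectors produced by the three image branches (saline, acetic acid, iodine) and the clinical-data branch. This is a minimal illustrative sketch only; the function name, the uniform weights, and the example probabilities are assumptions, and the paper's exact fusion rule may differ.

```python
def decision_level_fusion(prob_vectors, weights=None):
    """Fuse class-probability vectors from several branches by
    weighted averaging (one simple decision-level fusion scheme).
    Defaults to uniform weights across branches."""
    n = len(prob_vectors)
    weights = weights or [1.0 / n] * n
    # Weighted sum per class, then renormalise to a probability vector.
    fused = [sum(w * p[k] for w, p in zip(weights, prob_vectors))
             for k in range(len(prob_vectors[0]))]
    total = sum(fused)
    return [x / total for x in fused]

# Hypothetical softmax outputs (normal, CIN) for each branch:
saline = [0.70, 0.30]
acetic = [0.40, 0.60]
iodine = [0.55, 0.45]
clinical = [0.20, 0.80]

fused = decision_level_fusion([saline, acetic, iodine, clinical])
predicted = fused.index(max(fused))  # class index of the fused decision
```

With these example inputs the acetic-acid and clinical branches outweigh the saline branch, so the fused decision favours the CIN class; branch-specific weights could instead be learned or tuned on a validation set.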

Keywords


Cervical cancer; EfficientNet; Attention mechanism; Deep learning; Specular reflections

   

DOI

https://doi.org/10.26555/ijain.v10i2.1527
      

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

___________________________________________________________
International Journal of Advances in Intelligent Informatics
ISSN 2442-6571  (print) | 2548-3161 (online)
Organized by UAD and ASCEE Computer Society
Published by Universitas Ahmad Dahlan
W: http://ijain.org
E: info@ijain.org (paper handling issues)
   andri.pranolo.id@ieee.org (publication issues)
