(2) Abraham Eseoghene Evwiekpaefe (Department of Computer Science, Nigeria Defence Academy, Nigeria)
(3) Martins Ekata Irhebhude (Department of Computer Science, Nigeria Defence Academy, Nigeria)
*corresponding author
Abstract
Despite tremendous advancements in gender equality, persistent gender disparities remain, especially in important human activities; gender inequality and related concerns are therefore serious problems in our global society. Major players in the global economy have identified gender identification systems as a crucial stepping stone toward bridging the enormous gap in gender-based problems. Extensive research by forensic scientists has uncovered unique patterns in fingerprints, and these distinguishing characteristics can be used to determine an individual's gender. Numerous studies have proposed fingerprint-based approaches to gender recognition. This research presents a novel dynamic horizontal voting ensemble model with a hybrid Convolutional Neural Network and Long Short-Term Memory (CNN-LSTM) deep learning algorithm as the base learner to automatically determine human gender from fingerprint patterns. More than four thousand live fingerprint images were acquired and subjected to training, testing, and classification using the proposed model. The results indicated over 99% accuracy in predicting a person's gender. The proposed model also outperformed state-of-the-art models such as ResNet-34, VGG-19, ResNet-50, and EfficientNet-B3 when implemented on the SOCOFing public dataset.
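The abstract names a dynamic horizontal voting ensemble built on a hybrid CNN-LSTM base learner. As a rough illustration only (the full paper, not this page, specifies the actual architecture), the sketch below shows one plausible Keras reading of such a pipeline: a small CNN-LSTM binary classifier, weight snapshots saved over the final training epochs, and soft voting across those snapshots. The input size, layer widths, epoch and snapshot counts, label encoding, and function names are all assumptions, and the paper's "dynamic" snapshot-selection strategy is not reproduced here.

```python
# Hypothetical sketch (not the authors' exact model): a hybrid CNN-LSTM base
# learner for binary gender classification from fingerprint images, plus a
# plain horizontal voting ensemble that averages predictions of model
# snapshots kept from the last few training epochs.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SHAPE = (96, 96, 1)  # assumed grayscale fingerprint input size


def build_cnn_lstm(input_shape=IMG_SHAPE):
    """CNN front end extracts ridge features; each row of the final feature
    map is then read as a timestep by an LSTM before the sigmoid output."""
    inputs = layers.Input(shape=input_shape)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D()(x)
    # Collapse each feature-map row into one timestep: (rows, cols * channels)
    x = layers.Reshape((x.shape[1], x.shape[2] * x.shape[3]))(x)
    x = layers.LSTM(64)(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


def train_with_snapshots(x_train, y_train, epochs=30, keep_last=5):
    """Train one CNN-LSTM and keep weight snapshots from the final epochs;
    these snapshots are the members of the horizontal voting ensemble."""
    model = build_cnn_lstm()
    snapshots = []
    for epoch in range(epochs):
        model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)
        if epoch >= epochs - keep_last:
            snapshots.append(model.get_weights())
    return model, snapshots


def ensemble_predict(model, snapshots, x_test):
    """Soft voting: average the sigmoid outputs of every snapshot."""
    probs = []
    for weights in snapshots:
        model.set_weights(weights)
        probs.append(model.predict(x_test, verbose=0))
    # Assumed label encoding: 0 = male, 1 = female
    return (np.mean(probs, axis=0) > 0.5).astype(int)
```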
Keywords
Deep learning; fingerprint pattern; classification; gender; soft biometric
DOI: https://doi.org/10.26555/ijain.v8i3.927
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.