* Amelia Ritahani Ismail (Department of Computer Science, International Islamic University Malaysia, Malaysia)
Omar Abdelaziz Mohammad (Department of Computer Science, International Islamic University Malaysia, Malaysia)
* Corresponding author
Abstract: Imbalanced class data is a common issue in classification tasks. The Deep Belief Network (DBN) is a promising deep learning algorithm for learning from complex feature inputs. However, when handling imbalanced class data, DBN suffers from low performance, as do other machine learning algorithms. In this paper, a genetic algorithm (GA) and bootstrap sampling are incorporated into DBN to lessen the drawbacks that occur when imbalanced class datasets are used. The performance of the proposed algorithm is compared with that of a standard DBN and evaluated using performance metrics. The results show an improvement in performance when the Evolutionary DBN with bootstrap sampling is used to handle imbalanced class datasets.
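The abstract does not specify how bootstrap sampling is applied, so the following is only a minimal sketch of one common variant: resampling each class with replacement up to the majority-class size to produce a class-balanced bootstrap training set before the DBN is trained. The function name `bootstrap_balance` and all details below are illustrative assumptions, not the authors' implementation.

```python
import random
from collections import Counter

def bootstrap_balance(samples, labels, seed=0):
    """Resample each class with replacement up to the majority-class
    size, yielding a class-balanced bootstrap training set.

    Hypothetical helper -- one plausible reading of 'bootstrap
    sampling' for imbalanced data, not the paper's actual method.
    """
    rng = random.Random(seed)
    # Group samples by their class label.
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    # Every class is resampled up to the majority-class count.
    target = max(len(xs) for xs in by_class.values())
    bal_x, bal_y = [], []
    for y, xs in by_class.items():
        for _ in range(target):
            bal_x.append(rng.choice(xs))  # draw with replacement
            bal_y.append(y)
    return bal_x, bal_y

# Example: a 9:1 imbalanced toy set becomes 9:9 after resampling.
X = list(range(10))
y = [0] * 9 + [1]
Xb, yb = bootstrap_balance(X, y)
print(Counter(yb))  # both classes now have 9 samples
```

The GA component would then search over DBN hyperparameters or sample weights on such balanced replicates, but its fitness function and encoding are not described in this excerpt, so no sketch is attempted for it.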
DOI: https://doi.org/10.26555/ijain.v5i2.350
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
___________________________________________________________
International Journal of Advances in Intelligent Informatics
ISSN 2442-6571 (print) | 2548-3161 (online)
Organized by UAD and ASCEE Computer Society
Published by Universitas Ahmad Dahlan
W: http://ijain.org
E: info@ijain.org (paper handling issues)
andri.pranolo.id@ieee.org (publication issues)