Enhanced feature clustering method based on ant colony optimization for feature selection

(1) Hassan Almazini (Shatt Al-Arab University College, Iraq)
(2) Ku Ruhana Ku-Mahamud (Universiti Utara Malaysia, Kedah, Malaysia & Shibaura Institute of Technology, Tokyo, Japan)
(3) * Hussein Fouad Almazini (Shatt Al-Arab University College, Iraq)
*corresponding author


The popular modified graph clustering ant colony optimization (MGCACO) algorithm performs feature selection (FS) by grouping highly correlated features. However, MGCACO has weaknesses in its local search, which limit the search for an optimal feature subset. Hence, an enhanced feature clustering with ant colony optimization (ECACO) algorithm is proposed. The enhancement constructs an ACO-based feature clustering method that obtains clusters of highly correlated features by exploiting both local and global search mechanisms. The performance of ECACO was evaluated on six benchmark datasets from the University of California Irvine (UCI) repository and two deoxyribonucleic acid (DNA) microarray datasets, and compared against five benchmark metaheuristic algorithms. The classifiers used were random forest, k-nearest neighbors, decision tree, and support vector machine. Experimental results on the UCI datasets show that ECACO achieves superior classification accuracy across all classifiers, and experiments on the microarray datasets show that, in general, ECACO outperforms the other algorithms in average classification accuracy. ECACO can be applied to FS in classification tasks on high-dimensional datasets in application domains such as medical diagnosis, biological classification, and health care systems.
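The core idea of grouping highly correlated features before selection can be illustrated with a minimal sketch. The code below is a hypothetical greedy baseline using a Pearson-correlation threshold, written for illustration only; it is not the authors' ACO-based clustering, which additionally uses pheromone-guided local and global search. The function name `cluster_correlated_features` and the threshold value are assumptions.

```python
import numpy as np

def cluster_correlated_features(X, threshold=0.8):
    """Greedily group feature columns of X whose pairwise absolute
    Pearson correlation with a seed feature exceeds `threshold`.
    Returns a list of clusters, each a sorted list of column indices.
    Illustrative baseline only, not the ECACO clustering method."""
    n_features = X.shape[1]
    # Feature-by-feature absolute correlation matrix.
    corr = np.abs(np.corrcoef(X, rowvar=False))
    unassigned = set(range(n_features))
    clusters = []
    while unassigned:
        seed = min(unassigned)  # deterministic seed choice
        # All still-unassigned features strongly correlated with the seed
        # (the seed itself has corr 1.0, so it is always included).
        members = [j for j in unassigned if corr[seed, j] >= threshold]
        for j in members:
            unassigned.discard(j)
        clusters.append(sorted(members))
    return clusters

# Example: columns 0 and 1 are near-duplicates; column 2 is independent noise.
rng = np.random.default_rng(0)
base = rng.normal(size=200)
X = np.column_stack([base,
                     base + 0.01 * rng.normal(size=200),
                     rng.normal(size=200)])
clusters = cluster_correlated_features(X, threshold=0.8)
# clusters groups the two near-duplicate columns together.
```

In a wrapper FS pipeline, one representative feature per cluster would then be fed to a classifier (e.g., k-nearest neighbors) to score the subset; the ACO mechanisms in the paper guide which representatives are chosen.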


Keywords: Feature Clustering, Correlated Features, Local Search, Classification, Microarray












Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Advances in Intelligent Informatics
ISSN 2442-6571  (print) | 2548-3161 (online)
Organized by UAD and ASCEE Computer Society
Published by Universitas Ahmad Dahlan
W: http://ijain.org
E: info@ijain.org (paper handling issues)
   andri.pranolo.id@ieee.org (publication issues)
