Multi-objective clustering algorithm using particle swarm optimization with crowding distance (MCPSO-CD)

(1) Alwatben Batoul Rashed (Department of Information Technology, Qassim University, Saudi Arabia)
(2) * Hazlina Hamdan (Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Malaysia)
(3) Nurfadhlina Mohd Sharef (Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Malaysia)
(4) Md Nasir Sulaiman (Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Malaysia)
(5) Razali Yaakob (Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Malaysia)
(6) Mansir Abubakar (Faculty of Computer Science and Information Technology, Universiti Putra Malaysia, Malaysia)
*corresponding author

Abstract


Clustering, an unsupervised method of grouping data, is used in many fields to partition and restructure data so that they become more meaningful and can be transformed into useful information. Clustering is generally a difficult and complex problem: the appropriate number of clusters is usually unknown, the space of potential solutions is large, and the data are unlabelled. These difficulties can be addressed with Multi-Objective Particle Swarm Optimization (MOPSO), an approach widely used for optimization problems. However, MOPSO produces a set of non-dominated solutions, which makes selecting an "appropriate" Pareto-optimal (non-dominated) solution more difficult. According to the literature, crowding distance is one of the most efficient density-based measures for handling the selection mechanism used in archive updates. To address this problem, a clustering method is proposed that uses the crowding-distance (CD) technique to balance the optimality of the objectives during the search for Pareto-optimal solutions. The approach relies on the dominance concept and the crowding-distance mechanism to guarantee the survival of the best solutions: the crowding degree of each solution is computed first, and Pareto dominance is then applied. The proposed method was evaluated against clustering approaches that have proven successful in optimization, namely K-means, MCPSO, IMCPSO, spectral clustering, BIRCH, and average-link clustering. The evaluation results show that the proposed approach outperformed the state-of-the-art methods, with significant differences on most of the datasets tested.
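The selection mechanism described in the abstract, Pareto dominance combined with crowding distance for archive updates, can be illustrated with a short sketch. The Python snippet below is a minimal illustration under the assumption that all objectives are minimized; it is not the authors' MCPSO-CD implementation, and the names `dominates`, `crowding_distance`, and `update_archive` are hypothetical.

```python
# Illustrative sketch (not the paper's code): Pareto dominance and crowding
# distance for maintaining a bounded archive of non-dominated solutions,
# assuming every objective is to be minimized.
from typing import List, Sequence


def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """True if objective vector `a` Pareto-dominates `b` (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def crowding_distance(front: List[Sequence[float]]) -> List[float]:
    """Crowding distance of each solution in a non-dominated front."""
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])                      # number of objectives
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        f_min, f_max = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")   # boundary solutions
        if f_max == f_min:
            continue
        for j in range(1, n - 1):
            # Distance between the two neighbours along objective k, normalized.
            dist[order[j]] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / (f_max - f_min)
    return dist


def update_archive(archive: List[Sequence[float]], capacity: int) -> List[Sequence[float]]:
    """Keep only non-dominated solutions; if over capacity, drop the most crowded ones."""
    front = [a for a in archive if not any(dominates(b, a) for b in archive if b is not a)]
    if len(front) <= capacity:
        return front
    dist = crowding_distance(front)
    keep = sorted(range(len(front)), key=lambda i: dist[i], reverse=True)[:capacity]
    return [front[i] for i in keep]


# Example: a small archive of 2-objective vectors, trimmed to capacity 3.
archive = [(1.0, 5.0), (2.0, 3.0), (3.0, 2.5), (4.0, 1.0), (2.5, 2.8), (5.0, 6.0)]
print(update_archive(archive, capacity=3))
```

In this sketch, dominated solutions are discarded first, and when the archive exceeds its capacity the solutions with the largest crowding distance (the least crowded regions of the Pareto front) are retained, which is the balancing role the abstract attributes to the CD technique.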


DOI

https://doi.org/10.26555/ijain.v6i1.366
      




Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

___________________________________________________________
International Journal of Advances in Intelligent Informatics
ISSN 2442-6571  (print) | 2548-3161 (online)
Organized by UAD and ASCEE Computer Society
Published by Universitas Ahmad Dahlan
W: http://ijain.org
E: info@ijain.org (paper handling issues)
   andri.pranolo.id@ieee.org (publication issues)

