A comparison on classical-hybrid conjugate gradient method under exact line search

(1) * Nur Syarafina Mohamed (Universiti Kuala Lumpur, Malaysian Institute of Industrial Technology, Malaysia)
(2) Mustafa Mamat (Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin (UniSZA), Malaysia)
(3) Mohd Rivaie (Department of Computer Sciences and Mathematics, Universiti Teknologi MARA (UiTM), Malaysia)
(4) Shazlyn Milleana Shaharudin (Department of Mathematics, Universiti Pendidikan Sultan Idris, Malaysia)
*corresponding author

Abstract


One popular approach to modifying the Conjugate Gradient (CG) method is hybridization. In this paper, a new hybrid CG method is introduced and its performance is compared to that of two classical CG methods, the Rivaie-Mustafa-Ismail-Leong (RMIL) and Syarafina-Mustafa-Rivaie (SMR) methods. The proposed hybrid CG is constructed as a convex combination of the RMIL and SMR methods. Their performance is analyzed under the exact line search. The comparison shows that the hybrid CG is promising and outperforms the classical RMIL and SMR methods in terms of the number of iterations and central processing unit (CPU) time.
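The convex-combination idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RMIL coefficient follows the published formula (beta = g_k . (g_k - g_{k-1}) / ||d_{k-1}||^2), while the second coefficient is a Fletcher-Reeves-style stand-in, since the exact SMR formula is not reproduced here, and the mixing weight theta = 0.5 is an arbitrary illustrative choice. The test problem is a convex quadratic, for which the exact line search step has a closed form.

```python
# Sketch of a hybrid conjugate gradient step with a convex-combination
# coefficient, under exact line search on a quadratic f(x) = 0.5 x^T A x - b^T x.
# beta_fr below is a Fletcher-Reeves stand-in, NOT the actual SMR formula.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(M, v):
    return [dot(row, v) for row in M]

def cg_hybrid(A, b, x, iters=50, theta=0.5, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x with a hybrid nonlinear CG."""
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]  # gradient: Ax - b
    d = [-gi for gi in g]                             # start with steepest descent
    for _ in range(iters):
        if dot(g, g) < tol:
            break
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)               # exact line search (quadratic case)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
        y = [gn - go for gn, go in zip(g_new, g)]
        beta_rmil = dot(g_new, y) / dot(d, d)         # RMIL coefficient
        beta_fr = dot(g_new, g_new) / dot(g, g)       # stand-in for SMR
        beta = theta * beta_rmil + (1 - theta) * beta_fr  # convex combination
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Example: A = [[4, 1], [1, 3]], b = [1, 2]; the minimizer is x = [1/11, 7/11].
```

In a real benchmark the step size would come from a numerical exact line search (minimizing f along d), and the second coefficient would be the actual SMR formula from the paper.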


DOI

https://doi.org/10.26555/ijain.v5i2.356





Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

___________________________________________________________
International Journal of Advances in Intelligent Informatics
ISSN 2442-6571  (print) | 2548-3161 (online)
Organized by Informatics Department - Universitas Ahmad Dahlan , and UTM Big Data Centre - Universiti Teknologi Malaysia
Published by Universitas Ahmad Dahlan
W : http://ijain.org
E : info@ijain.org, andri.pranolo@tif.uad.ac.id (paper handling issues)
     ijain@uad.ac.id, andri.pranolo.id@ieee.org (publication issues)

