Continuation Newton methods with deflation techniques for global optimization problems

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

The global minimum of an optimization problem is of interest in many engineering fields, and it is difficult to find, especially for a nonconvex large-scale problem. In this article, we consider a new memetic algorithm for this problem. That is, we use the continuation Newton method with a deflation technique to find multiple stationary points of the objective function, and we use those stationary points as the initial seeds of the evolutionary algorithm, rather than the random initial seeds used by known evolutionary algorithms. Meanwhile, in order to retain the usability of a derivative-free method and the fast convergence of a gradient-based method, we use automatic differentiation to compute the gradient and replace the Hessian matrix with its finite-difference approximation. According to our numerical experiments, the new algorithm works well for unconstrained optimization problems and finds their global minima efficiently, in comparison with other representative global optimization methods such as multi-start methods (the built-in subroutine GlobalSearch.m of MATLAB R2021b, GLODS, and VRBBO), a branch-and-bound method (Couenne, a state-of-the-art open-source solver for mixed-integer nonlinear programming problems), and derivative-free algorithms (CMA-ES and MCS).
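The deflation idea described above can be sketched in a few lines. The code below is a hypothetical, simplified illustration (names such as `deflated_newton`, `grad`, and `hess_fd` are ours, not the authors' CNMGE code): it applies plain Newton's method to the gradient of a one-dimensional objective instead of the paper's continuation Newton flow, divides the gradient by the distance to stationary points already found (a Brown-Gearhart style norm deflation, reference [13]), and replaces the second derivative with a central finite difference, mirroring the paper's finite-difference Hessian approximation.

```python
# Illustrative sketch only, not the authors' method: deflated Newton
# iteration on the gradient of the two-well objective f(x) = x^4 - 2x^2,
# whose stationary points are x = -1, 0, 1.

def grad(x):
    # Gradient of f(x) = x**4 - 2*x**2.
    return 4.0 * x**3 - 4.0 * x

def hess_fd(x, h=1e-6):
    # Central finite-difference approximation of f''(x).
    return (grad(x + h) - grad(x - h)) / (2.0 * h)

def deflated_newton(x, found, tol=1e-10, max_iter=200):
    # Newton iteration on the deflated residual
    #     G(x) = grad(x) / prod_j |x - x_j|,
    # where the x_j are previously found stationary points, so the
    # iteration is repelled from points it has already visited.
    for _ in range(max_iter):
        g, H = grad(x), hess_fd(x)
        defl, dlog = 1.0, 0.0      # deflation factor and (log defl)'
        for xj in found:
            defl /= abs(x - xj)
            dlog -= 1.0 / (x - xj)
        G = g * defl
        if abs(G) < tol:
            break
        x -= G / (defl * (H + g * dlog))   # Newton step on G
    return x

found = []
for _ in range(3):                 # same start 2.0 every restart;
    found.append(deflated_newton(2.0, found))   # deflation finds a new point
print(sorted(found))               # approximately [-1.0, 0.0, 1.0]
```

Since deflation only modifies the residual, each restart can reuse the same starting point and the same Newton machinery; the paper additionally replaces plain Newton with a continuation Newton method so that each run converges to a stationary point from remote initial guesses.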

Availability of data and material (data transparency)

The test data are available from the authors upon request.

Code availability (software application or custom code)

https://teacher.bupt.edu.cn/luoxinlong/zh_CN/zzcg/41406/list/index.htm.

References

  1. Abbott, J.P.: Numerical continuation methods for nonlinear equations and bifurcation problems. Ph.D. Thesis, Computer Center, Australian National University (1977)

  2. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)

  3. Adorio, E.P., Diliman, U.P.: MVF-multivariate test functions library in C for unconstrained global optimization. (2005) available at http://www.geocities.ws/eadorio/mvf.pdf

  4. Andricioaei, I., Straub, J.E.: Global optimization using bad derivatives: derivative-free method for molecular energy minimization. J. Comput. Chem. 19, 1445–1455 (1998)

  5. Allgower, E.L., Georg, K.: Introduction to numerical continuation methods. SIAM, Philadelphia, PA (2003)

  6. Ascher, U.M., Petzold, L.R.: Computer methods for ordinary differential equations and differential-algebraic equations. SIAM, Philadelphia, PA (1998)

  7. Axelsson, O., Sysala, S.: Continuation Newton methods. Comput. Math. Appl. 70, 2621–2637 (2015)

  8. Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection, Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692, 1992

  9. Baydin, A.G., Pearlmutter, B.A., Radul, A.A., Siskind, J.M.: Automatic differentiation in machine learning: a survey. J. Mach. Learn. Res. 18, 5595–5637 (2017)

  10. Belotti, P., Lee, J., Liberti, L., Margot, F., Wächter, A.: Branching and bounds tightening techniques for non-convex MINLP. Optim. Methods Softw. 24, 597–634 (2009)

  11. Boender, C.G.E.: Bayesian stopping rules for multistart global optimization methods. Math. Program. 37, 59–80 (1987)

  12. Branin, F.H.: Widely convergent method for finding multiple solutions of simultaneous nonlinear equations. IBM J. Res. Dev. 16, 504–521 (1972)

  13. Brown, K.M., Gearhart, W.B.: Deflation techniques for the calculation of further solutions of a nonlinear system. Numer. Math. 16, 334–342 (1971)

  14. Braden, A.: Optimisation techniques for solving design problems in modern trombones. In: Forum Acusticum 557–662 (2005)

  15. Conn, A.R., Gould, N., Toint, Ph.L.: Trust-region methods. SIAM, Philadelphia, PA (2000)

  16. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to derivative-free optimization. SIAM, Philadelphia, PA (2009)

  17. Couenne: a solver for non-convex MINLP problems, available at https://www.coin-or.org/Couenne/, February (2020)

  18. CMA-ES: the covariance matrix adaptation evolution strategy, available at http://www.cmap.polytechnique.fr/~nikolaus.hansen/cmaes.m (2012)

  19. Czyzyk, J., Mesnier, M.P., Moré, J.J.: The NEOS Server. IEEE Comput. Sci. Eng. 5, 68–75 (1998)

  20. Custódio, A.L., Madeira, J.F.A.: GLODS: global and local optimization using direct search. J. Glob. Optim. 62, 1–28 (2015). https://doi.org/10.1007/s10898-014-0224-9

  21. Davidenko, D.F.: On a new method of numerical solution of systems of nonlinear equations (in Russian). Dokl. Akad. Nauk SSSR 88, 601–602 (1953)

  22. Deuflhard, P.: Newton methods for nonlinear problems: affine invariance and adaptive algorithms. Springer-Verlag, Berlin (2004)

  23. Dolan, E.D.: The NEOS Server 4.0 administrative guide, Technical Memorandum ANL/MCS-TM-250, Mathematics and Computer Science Division. Argonne National Laboratory (2001)

  24. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  25. Deuflhard, P., Pesch, H.J., Rentrop, P.: A modified continuation method for the numerical solution of nonlinear two-point boundary value problems by shooting techniques. Numer. Math. 26, 327–343 (1975)

  26. Dennis, J.E., Schnabel, R.B.: Numerical methods for unconstrained optimization and nonlinear equations. SIAM, Philadelphia, PA (1996)

  27. Dong, H.C., Song, B.W., Dong, Z.M., Wang, P.: Multi-start space reduction (MSSR) surrogate-based global optimization method. Struct. Multidisc. Optim. 54, 907–926 (2016)

  28. Elhara, O., Varelas, K., Nguyen, D., Tusar, T., Brockhoff, D., Hansen, N., Auger, A.: COCO: the large scale black-box optimization benchmarking (bbob-largescale) test suite, arXiv preprint available at https://arxiv.org/abs/1903.06396 (2019)

  29. Gao, W., Mi, C.: Hybrid vehicle design using global optimisation algorithms. Int. J. Electric Hybrid Veh. 1, 57–70 (2007)

  30. Gropp, W., Moré, J. J.: Optimization environments and the NEOS server. In: Buhmann, M.D., Iserles, A. (eds.) Approximation Theory and Optimization, Cambridge University Press, (1997)

  31. Golub, G.H., Van Loan, C.F.: Matrix computations, 4th edn. The Johns Hopkins University Press, Baltimore (2013)

  32. Griewank, A., Walther, A.: Evaluating derivatives: principles and techniques of algorithmic differentiation, SIAM, Philadelphia, (2008). https://doi.org/10.1137/1.9780898717761

  33. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60, 545–557 (2015). https://www.cuter.rl.ac.uk/mastsif.html

  34. Hansen, N.: The CMA evolution strategy: a comparing review. In: Lozano, J.A., Larranaga, P., Inza, I., Bengoetxea, E. (eds.) Towards a new evolutionary computation, pp. 75–102. Advances on Estimation of Distribution Algorithms, Springer, Berlin (2006)

  35. Hansen, N.: The CMA evolution strategy: a tutorial, available at https://arxiv.org/abs/1604.00772 (2010)

  36. Hansen, C.H., Simpson, M.T., Cazzolato, B.S.: Active sound and vibration control: theory and applications, chapter 9: genetic algorithms for optimising ASVC systems, pp. 185-220, No. 62 in IEE control engineering series, London, UK (2002)

  37. Hart, W.E.: Adaptive global optimization with local search, Ph.D. dissertation, University of California, San Diego, CA, USA, (1994)

  38. Higham, D.J.: Trust region algorithms and timestep selection. SIAM J. Numer. Anal. 37, 194–210 (1999)

  39. Huyer, W., Neumaier, A.: Global optimization by multilevel coordinate search. J. Glob. Optim. 14, 331–355 (1999)

  40. Hairer, E., Wanner, G.: Solving ordinary differential equations II. Stiff and differential-algebraic problems, 2nd edn. Springer-Verlag, Berlin (1996)

  41. Jackiewicz, Z.: General linear methods for ordinary differential equations. John Wiley & Sons Inc, Hoboken, New Jersey (2009)

  42. Kearfott, R.B.: Rigorous global search: continuous problems. Nonconvex Optimization and Applications, Kluwer Academic, Dordrecht (1996)

  43. Kelley, C.T.: Solving nonlinear equations with Newton’s method. SIAM, Philadelphia, PA (2003)

  44. Kelley, C.T.: Numerical methods for nonlinear equations. Acta Numer. 27, 207–287 (2018)

  45. Kimiaei, M., Neumaier, A.: Efficient unconstrained black box optimization, Math. Program. Comput. 14 (2022), 365-414. https://doi.org/10.1007/s12532-021-00215-9. Software available at https://arnold-neumaier.at/software/VRBBO/

  46. Kvasov, D.E., Sergeyev, Y.D.: Lipschitz gradients for global optimization in a one-point-based partitioning scheme. J. Comput. Appl. Math. 236, 4042–4054 (2012)

  47. Lambert, J.D.: Computational methods in ordinary differential equations. John Wiley, (1973)

  48. Lavor, C., Maculan, N.: A function to test methods applied to global minimization of potential energy of molecules. Numer. Algorithms 35, 287–300 (2004)

  49. Leung, Y.-W., Wang, Y.P.: An orthogonal genetic algorithm with quantization for global numerical optimization. IEEE Trans. Evol. Comput. 5, 41–53 (2001)

  50. Liu, S.-T., Luo, X.-L.: A method based on Rayleigh quotient gradient flow for extreme and interior eigenvalue problems. Linear Algebra Appl. 432, 1851–1863 (2010)

  51. Luo, X.-L.: Singly diagonally implicit Runge-Kutta methods combining line search techniques for unconstrained optimization. J. Comput. Math. 23, 153–164 (2005)

  52. Luo, X.-L., Kelley, C.T., Liao, L.-Z., Tam, H.-W.: Combining trust region techniques and Rosenbrock methods to compute stationary points. J. Optim. Theory Appl. 140, 265–286 (2009)

  53. Luo, X.-L.: A second-order pseudo-transient method for steady-state problems. Appl. Math. Comput. 216, 1752–1762 (2010)

  54. Luo, X.-L.: A dynamical method of DAEs for the smallest eigenvalue problem. J. Comput. Sci. 3, 113–119 (2012)

  55. Luo, X.-L., Lv, J.-H., Sun, G.: Continuation method with the trusty time-stepping scheme for linearly constrained optimization with noisy data, Optim. Eng. 23, 329–360 (2022). http://doi.org/10.1007/s11081-020-09590-z

  56. Luo, X.-L., Xiao, H., Lv, J.-H.: Continuation Newton methods with the residual trust-region time-stepping scheme for nonlinear equations, Numer. Algorithms 89, 223–247 (2022). http://doi.org/10.1007/s11075-021-01112-x

  57. Luo, X.-L., Yao, Y.Y.: Primal-dual path-following methods and the trust-region strategy for linear programming with noisy data, J. Comput. Math. 40, 760–780 (2022). http://doi.org/10.4208/jcm.2101-m2020-0173

  58. Luo, X.-L., Xiao, H., Lv, J.-H., Zhang, S.: Explicit pseudo-transient continuation and the trust-region updating strategy for unconstrained optimization. Appl. Numer. Math. 165, 290–302 (2021). http://doi.org/10.1016/j.apnum.2021.02.019

  59. Luo, X.-L., Xiao, H.: Generalized continuation Newton methods and the trust-region updating strategy for the underdetermined system, J. Sci. Comput. 88, article 56, 1–22 (2021). http://doi.org/10.1007/s10915-021-01566-0

  60. Luo, X.-L., Xiao, H.: The regularization continuation method with an adaptive time step control for linearly constrained optimization problems, Appl. Numer. Math. 181, 255–276 (2022). https://doi.org/10.1016/j.apnum.2022.06.008

  61. Luo, X.-L., Zhang, S., Xiao, H.: Regularization path-following methods with the trust-region updating strategy for linear complementarity problems. arXiv preprint available at http://arxiv.org/abs/2205.10727, pp. 1–30, May 21, (2022)

  62. Luo, X.-L., Xiao, H., Zhang, S.: The regularization continuation method for optimization problems with nonlinear equality constraints. arXiv preprint available at http://arxiv.org/abs/2303.14692, pp. 1–41, March 28, (2023)

  63. Man, K.F., Tang, K.S., Kwong, S.: Genetic algorithms: concepts and designs. Springer, Berlin (1999)

  64. Macêdo, M.J.F.G., Karas, E.W., Costa, M.F.P., Rocha, A.M.A.C.: Filter-based stochastic algorithm for global optimization. J. Glob. Optim. 77, 777–805 (2020)

  65. MATLAB R2021b: The MathWorks Inc., http://www.mathworks.com (2021)

  66. MCS: the multilevel coordinate search, available at https://www.mat.univie.ac.at/~neum/software/mcs/ (2000)

  67. Mitchell, M.: An introduction to genetic algorithms. MIT press, Cambridge, MA (1996)

  68. Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Soft. 7, 17–41 (1981)

  69. Moscato, P.: On evolution, search, optimization, gas and martial arts: toward memetic algorithms, Technical report, Caltech Concurrent Computation Program 158–79. California Institute of Technology, Pasadena, California (1989)

  70. Morgans, R.C., Howard, C.Q., Zander, A.C., Hansen, C.H., Murphy, D.J.: Derivative free optimisation in engineering and acoustics, 14th International Congress on Sound & Vibration, 1–8 (2007)

  71. Neidinger, R.D.: Introduction to automatic differentiation and MATLAB object-oriented programming. SIAM Rev. 52, 545–563 (2010). https://doi.org/10.1137/080743627

  72. Neumaier, A.: MCS: global optimization by multilevel coordinate search (2000). https://www.mat.univie.ac.at/~neum/software/mcs/

  73. NEOS Server: (2021). https://neos-server.org/neos/

  74. Nocedal, J., Wright, S.J.: Numerical optimization. Springer-Verlag, Berlin (1999)

  75. Ortega, J.M., Rheinboldt, W.C.: Iterative solution of nonlinear equations in several variables. SIAM, Philadelphia, PA (2000)

  76. Regis, R.G., Shoemaker, C.A.: A quasi-multistart framework for global optimization of expensive functions using response surface models. J. Glob. Optim. 56, 1719–1753 (2013)

  77. Rios, L.M., Sahinidis, N.V.: Derivative-free optimization: a review of algorithms and comparison of software implementations. J. Glob. Optim. 56, 1247–1293 (2013)

  78. Rosenbrock, H.H.: An automatic method for finding the greatest or least value of a function, Comput. J. 3, 175–184 (1960). Available online at http://comjnl.oxfordjournals.org/content/3/3/175.full.pdf

  79. Shampine, L.F., Gladwell, I., Thompson, S.: Solving ODEs with MATLAB. Cambridge University Press, Cambridge (2003)

  80. Sahinidis, N. V.: BARON 21.1.13: Global optimization of mixed-integer nonlinear programs, user’s manual (2021). Available at https://minlp.com/downloads/docs/baron manual.pdf

  81. Surjanovic, S., Bingham, D.: Virtual library of simulation experiments: test functions and datasets, available at http://www.sfu.ca/~ssurjano, January (2020)

  82. Sun, J., Garibaldi, J.M., Krasnogor, N., Zhang, Q.: An intelligent multi-restart memetic algorithm for box constrained global optimisation. Evol. Comput. 21, 107–147 (2014)

  83. Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 8, 1–9 (2018)

  84. Sergeyev, Y.D., Kvasov, D.E.: Deterministic global optimization: an introduction to the diagonal approach, Springer, (2017)

  85. Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms. Math. Comput. Simul. 141, 96–109 (2017)

  86. Sergeyev, Y.D., Kvasov, D.E.: A deterministic global optimization using smooth diagonal auxiliary functions. Commun. Nonlinear Sci. 21, 99–111 (2015)

  87. Sun, W.-Y., Yuan, Y.-X.: Optimization theory and methods: nonlinear programming. Springer, Berlin (2006)

  88. Tanabe, K.: Continuous Newton-Raphson method for solving an underdetermined system of nonlinear equations. Nonlinear Anal. 3, 495–503 (1979)

  89. Tawarmalani, M., Sahinidis, N.V.: A polyhedral branch-and-cut approach to global optimization. Math. Program. 103, 225–249 (2005)

  90. Teughels, A., De Roeck, G., Suykens, J.A.K.: Global optimization by coupled local minimizers and its application to FE model updating. Comput. Struct. 81, 2337–2351 (2003)

  91. Ugray, Z., Lasdon, L., Plummer, J., Glover, F., Kelly, J., Marti, R.: Scatter search and local NLP solvers: a multistart framework for global optimization. INFORMS J. Comput. 19, 328–340 (2007)

  92. Willkomm, J., Vehreschild, A.: The ADiMat handbook, (2013). http://adimat.sc.informatik.tu-darmstadt.de/doc/

  93. Willkomm, J., Bischof, C.H., Bücker, H.M.: A new user interface for ADiMat: toward accurate and efficient derivatives of MATLAB programmes with ease of use. Int. J. Comput. Sci. Eng. 9, 408–415 (2014)

  94. Xu, J., Nannariello, J., Fricke, F.R.: Optimising flat-walled multi-layered anechoic linings using evolutionary algorithms. Appl. Acoust. 65, 1009–1026 (2004)

  95. Yuan, Y.-X.: Trust region algorithms for nonlinear equations. Information 1, 7–20 (1998)

  96. Yuan, Y.-X.: Recent advances in trust region algorithms. Math. Program. 151, 249–281 (2015)

  97. Žilinskas, A., Gillard, J., Scammell, M., Zhiglijavsky, A.: Multistart with early termination of descents. J. Glob. Optim. 79, 447–462 (2021)

Acknowledgements

The authors are grateful to Prof. Jonathan Eckstein for suggesting the comparison with Couenne and to Prof. Nick Sahinidis for suggesting the comparison with derivative-free optimization methods such as MCS and CMA-ES. The authors also thank two anonymous referees for their comments and suggestions, which greatly improved the presentation of this paper.

Funding

This work was supported in part by Grants 61876199 and 62376036 from the National Natural Science Foundation of China, Grant YBWL2011085 from Huawei Technologies Co., Ltd., and Grant YJCB2011003HI from the Innovation Research Program of Huawei Technologies Co., Ltd.

Author information

Contributions

Xin-long Luo and Hang Xiao wrote the main manuscript text, and Sen Zhang performed all test problems and prepared Figs. 1–4. All authors reviewed the manuscript.

Corresponding author

Correspondence to Xin-long Luo.

Ethics declarations

Ethical approval

Not applicable

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A Tables of Numerical Results

Table 3 Numerical results of large-scale problems (no. 1–17) computed by CNMGE, CNMTrM, CNMDTM, and CNMGE_AG
Table 4 Numerical results of large-scale problems (no. 18–34) computed by CNMGE, CNMTrM, CNMDTM, and CNMGE_AG
Table 5 Numerical results of small-scale problems (no. 35–50) computed by CNMGE, CNMTrM, CNMDTM, and CNMGE_AG
Table 6 Numerical results of small-scale problems (no. 51–68) computed by CNMGE, CNMTrM, CNMDTM, and CNMGE_AG
Table 7 Numerical results of large-scale problems (no. 1–17) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 8 Numerical results of large-scale problems (no. 18–34) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 9 Numerical results of small-scale problems (no. 35–51) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 10 Numerical results of small-scale problems (no. 52–68) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 11 Numerical results of large-scale problems (no. 1–17) computed by CNMGE, MCS, GLODS, and VRBBO
Table 12 Numerical results of large-scale problems (no. 18–34) computed by CNMGE, MCS, GLODS, and VRBBO
Table 13 Numerical results of small-scale problems (no. 35–51) computed by CNMGE, MCS, GLODS, and VRBBO
Table 14 Numerical results of small-scale problems (no. 52–68) computed by CNMGE, MCS, GLODS, and VRBBO
Table 15 Numerical results of CUTEst problems [33] (no. 69–84) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 16 Numerical results of CUTEst problems [33] (no. 85–100) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 17 Numerical results of CUTEst problems [33] (no. 101–116) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 18 Numerical results of CUTEst problems [33] (no. 117–132) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 19 Numerical results of CUTEst problems [33] (no. 133–148) computed by CNMGE, GlobalSearch, Couenne, and CMA-ES
Table 20 Numerical results of CUTEst problems [33] (no. 69–84) computed by CNMGE, MCS, GLODS, and VRBBO
Table 21 Numerical results of CUTEst problems [33] (no. 85–100) computed by CNMGE, MCS, GLODS, and VRBBO
Table 22 Numerical results of CUTEst problems [33] (no. 101–116) computed by CNMGE, MCS, GLODS, and VRBBO
Table 23 Numerical results of CUTEst problems [33] (no. 117–132) computed by CNMGE, MCS, GLODS, and VRBBO
Table 24 Numerical results of CUTEst problems [33] (no.133-148) computed by CNMGE, MCS, GLODS, and VRBBO

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Luo, X.-L., Xiao, H. & Zhang, S. Continuation Newton methods with deflation techniques for global optimization problems. Numer Algor (2024). https://doi.org/10.1007/s11075-024-01768-1

  • DOI: https://doi.org/10.1007/s11075-024-01768-1
