Abstract
The global minimum point of an optimization problem is of great interest in engineering fields, but it is difficult to compute, especially for nonconvex large-scale optimization problems. In this article, we consider a new memetic algorithm for this problem. That is, we use the continuation Newton method with a deflation technique to find multiple stationary points of the objective function and use those stationary points as the initial seeds of the evolutionary algorithm, rather than the random initial seeds used by known evolutionary algorithms. Meanwhile, in order to retain the usability of derivative-free methods and the fast convergence of gradient-based methods, we use the automatic differentiation technique to compute the gradient and replace the Hessian matrix with its finite-difference approximation. According to our numerical experiments, this new algorithm works well for unconstrained optimization problems and finds their global minima efficiently, in comparison with other representative global optimization methods such as multi-start methods (the built-in subroutine GlobalSearch.m of MATLAB R2021b, GLODS, and VRBBO), the branch-and-bound method (Couenne, a state-of-the-art open-source solver for mixed-integer nonlinear programming problems), and derivative-free algorithms (CMA-ES and MCS).
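The seeding phase sketched in the abstract — find several stationary points by a Newton-type iteration with deflation, then rank them by objective value as initial seeds — can be illustrated with a deliberately simplified one-dimensional sketch. Here a damped classical Newton iteration stands in for the continuation Newton method, and a central finite difference of the deflated gradient stands in for both the automatic-differentiation gradient and the finite-difference Hessian approximation; all function names and parameters below are illustrative, not the authors' implementation.

```python
def deflated_newton_seeds(f, fprime, x0, n_seeds=3, sigma=1.0,
                          tol=1e-8, max_iter=200, h=1e-6):
    """Find several stationary points of f (roots of fprime) and return
    them sorted by objective value, i.e. as ranked initial seeds for a
    subsequent evolutionary phase (a simplified sketch)."""
    roots = []

    def deflated(x):
        # Deflation operator: each found root r contributes the factor
        # 1/(x - r)^2 + sigma, which blows up near r and thus repels
        # the iteration from already-found stationary points.
        m = 1.0
        for r in roots:
            m *= 1.0 / max((x - r) ** 2, 1e-300) + sigma
        return m * fprime(x)

    for _ in range(n_seeds):
        x = float(x0)  # restart from the same point for each deflated solve
        for _ in range(max_iter):
            g = deflated(x)
            # central finite difference of the deflated residual
            dg = (deflated(x + h) - deflated(x - h)) / (2.0 * h)
            if dg == 0.0:
                break
            step = max(min(-g / dg, 0.9), -0.9)  # damp long Newton steps
            x += step
            if abs(fprime(x)) < tol and all(abs(x - r) > 1e-4 for r in roots):
                roots.append(x)
                break
    return sorted(roots, key=f)

# Example: f has stationary points at -1, 0, 1; the global minima are
# at x = +/-1 with f = -1.
f = lambda x: x ** 4 - 2.0 * x ** 2
fp = lambda x: 4.0 * x ** 3 - 4.0 * x
seeds = deflated_newton_seeds(f, fp, x0=2.0)
```

Without deflation, every restart from the same `x0` would rediscover the same stationary point; the deflation factors force each subsequent solve toward a different one, which is what makes the found points useful as a diverse seed population.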
Availability of data and material (data transparency)
The test data are available from the authors upon request.
Code availability (software application or custom code)
References
Abbott, J.P.: Numerical continuation methods for nonlinear equations and bifurcation problems. Ph.D. Thesis, Computer Center, Australian National University 1977
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)
Adorio, E.P., Diliman, U.P.: MVF-multivariate test functions library in C for unconstrained global optimization. (2005) available at http://www.geocities.ws/eadorio/mvf.pdf
Andricioaei, I., Straub, J.E.: Global optimization using bad derivatives: derivative-free method for molecular energy minimization. J. Comput. Chem. 19, 1445–1455 (1998)
Allgower, E.L., Georg, K.: Introduction to numerical continuation methods. SIAM, Philadelphia, PA (2003)
Ascher, U.M., Petzold, L.R.: Computer methods for ordinary differential equations and differential-algebraic equations. SIAM, Philadelphia, PA (1998)
Axelsson, O., Sysala, S.: Continuation Newton methods. Comput. Math. Appl. 70, 2621–2637 (2015)
Averick, B.M., Carter, R.G., Moré, J.J., Xue, G.L.: The MINPACK-2 test problem collection, Mathematics and Computer Science Division, Argonne National Laboratory, Preprint MCS-P153-0692, 1992
Baydin, A.G., Pearlmutter, B.A., Radul, A.A., Siskind, J.M.: Automatic differentiation in machine learning: a survey. J. Mach. Learn. Res. 18, 5595–5637 (2017)
Belotti, P., Lee, J., Liberti, L., Margot, F., Wächter, A.: Branching and bounds tightening techniques for non-convex MINLP. Optim. Methods Softw. 24, 597–634 (2009)
Boender, C.G.E.: Bayesian stopping rules for multistart global optimization methods. Math. Program. 37, 59–80 (1987)
Branin, F.H.: Widely convergent method for finding multiple solutions of simultaneous nonlinear equations. IBM J. Res. Dev. 16, 504–521 (1972)
Brown, K.M., Gearhart, W.B.: Deflation techniques for the calculation of further solutions of a nonlinear system. Numer. Math. 16, 334–342 (1971)
Braden, A.: Optimisation techniques for solving design problems in modern trombones. In: Forum Acusticum 557–662 (2005)
Conn, A.R., Gould, N., Toint, Ph.L.: Trust-region methods. SIAM, Philadelphia, PA (2000)
Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to derivative-free optimization. SIAM, Philadelphia, PA (2009)
Couenne: a solver for non-convex MINLP problems, available at https://www.coin-or.org/Couenne/, February (2020)
CMA-ES: the covariance matrix adaptation evolution strategy, available at http://www.cmap.polytechnique.fr/~nikolaus.hansen/cmaes.m, (2012)
Czyzyk, J., Mesnier, M.P., Moré, J.J.: The NEOS Server. IEEE Comput. Sci. Eng. 5, 68–75 (1998)
Custódio, A.L., Madeira, J.F.A.: GLODS: global and local optimization using direct search. J. Glob. Optim. 62, 1–28 (2015). https://doi.org/10.1007/s10898-014-0224-9
Davidenko, D.F.: On a new method of numerical solution of systems of nonlinear equations (in Russian). Dokl. Akad. Nauk SSSR 88, 601–602 (1953)
Deuflhard, P.: Newton methods for nonlinear problems: affine invariance and adaptive algorithms. Springer-Verlag, Berlin (2004)
Dolan, E.D.: The NEOS Server 4.0 administrative guide, Technical Memorandum ANL/MCS-TM-250, Mathematics and Computer Science Division. Argonne National Laboratory (2001)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Deuflhard, P., Pesch, H.J., Rentrop, P.: A modified continuation method for the numerical solution of nonlinear two-point boundary value problems by shooting techniques. Numer. Math. 26, 327–343 (1975)
Dennis, J.E., Schnabel, R.B.: Numerical methods for unconstrained optimization and nonlinear equations. SIAM, Philadelphia, PA (1996)
Dong, H.C., Song, B.W., Dong, Z.M., Wang, P.: Multi-start space reduction (MSSR) surrogate-based global optimization method. Struct. Multidisc. Optim. 54, 907–926 (2016)
Elhara, O., Varelas, K., Nguyen, D., Tusar, T., Brockhoff, D., Hansen, N., Auger, A.: COCO: the large scale black-box optimization benchmarking (bbob-largescale) test suite, arXiv preprint available at https://arxiv.org/abs/1903.06396 (2019)
Gao, W., Mi, C.: Hybrid vehicle design using global optimisation algorithms. Int. J. Electric Hybrid Veh. 1, 57–70 (2007)
Gropp, W., Moré, J. J.: Optimization environments and the NEOS server. In: Buhmann, M.D., Iserles, A. (eds.) Approximation Theory and Optimization, Cambridge University Press, (1997)
Golub, G.H., Van Loan, C.F.: Matrix computations, 4th edn. The Johns Hopkins University Press, Baltimore (2013)
Griewank, A., Walther, A.: Evaluating derivatives: principles and techniques of algorithmic differentiation, SIAM, Philadelphia, (2008). https://doi.org/10.1137/1.9780898717761
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60, 545–557 (2015). https://www.cuter.rl.ac.uk/mastsif.html
Hansen, N.: The CMA evolution strategy: a comparing review. In: Lozano, J.A., Larranaga, P., Inza, I., Bengoetxea, E. (eds.) Towards a new evolutionary computation, pp. 75–102. Advances on Estimation of Distribution Algorithms, Springer, Berlin (2006)
Hansen, N.: The CMA evolution strategy: a tutorial, available at https://arxiv.org/abs/1604.00772 (2010)
Hansen, C.H., Simpson, M.T., Cazzolato, B.S.: Active sound and vibration control: theory and applications, chapter 9: genetic algorithms for optimising ASVC systems, pp. 185-220, No. 62 in IEE control engineering series, London, UK (2002)
Hart, W.E.: Adaptive global optimization with local search, Ph.D. dissertation, University of California, San Diego, CA, USA, (1994)
Higham, D.J.: Trust region algorithms and timestep selection. SIAM J. Numer. Anal. 37, 194–210 (1999)
Huyer, W., Neumaier, A.: Global optimization by multilevel coordinate search. J. Glob. Optim. 14, 331–355 (1999)
Hairer, E., Wanner, G.: Solving ordinary differential equations II. Stiff and differential-algebraic problems, 2nd edn. Springer-Verlag, Berlin (1996)
Jackiewicz, Z.: General linear methods for ordinary differential equations. John Wiley & Sons Inc, Hoboken, New Jersey (2009)
Kearfott, R.B.: Rigorous global search: continuous problems. Nonconvex Optimization and Applications, Kluwer Academic, Dordrecht (1996)
Kelley, C.T.: Solving nonlinear equations with Newton’s method. SIAM, Philadelphia, PA (2003)
Kelley, C.T.: Numerical methods for nonlinear equations. Acta Numer. 27, 207–287 (2018)
Kimiaei, M., Neumaier, A.: Efficient unconstrained black box optimization, Math. Program. Comput. 14 (2022), 365-414. https://doi.org/10.1007/s12532-021-00215-9. Software available at https://arnold-neumaier.at/software/VRBBO/
Kvasov, D.E., Sergeyev, Y.D.: Lipschitz gradients for global optimization in a one-point-based partitioning scheme. J. Comput. Appl. Math. 236, 4042–4054 (2012)
Lambert, J.D.: Computational methods in ordinary differential equations. John Wiley, (1973)
Lavor, C., Maculan, N.: A function to test methods applied to global minimization of potential energy of molecules. Numer. Algorithms 35, 287–300 (2004)
Leung, Y.-W., Wang, Y.P.: An orthogonal genetic algorithm with quantization for global numerical optimization. IEEE Trans. Evol. Comput. 5, 41–53 (2001)
Liu, S.-T., Luo, X.-L.: A method based on Rayleigh quotient gradient flow for extreme and interior eigenvalue problems. Linear Algebra Appl. 432, 1851–1863 (2010)
Luo, X.-L.: Singly diagonally implicit Runge-Kutta methods combining line search techniques for unconstrained optimization. J. Comput. Math. 23, 153–164 (2005)
Luo, X.-L., Kelley, C.T., Liao, L.-Z., Tam, H.-W.: Combining trust region techniques and Rosenbrock methods to compute stationary points. J. Optim. Theory Appl. 140, 265–286 (2009)
Luo, X.-L.: A second-order pseudo-transient method for steady-state problems. Appl. Math. Comput. 216, 1752–1762 (2010)
Luo, X.-L.: A dynamical method of DAEs for the smallest eigenvalue problem. J. Comput. Sci. 3, 113–119 (2012)
Luo, X.-L., Lv, J.-H., Sun, G.: Continuation method with the trusty time-stepping scheme for linearly constrained optimization with noisy data, Optim. Eng. 23, 329–360 (2022). http://doi.org/10.1007/s11081-020-09590-z
Luo, X.-L., Xiao, H., Lv, J.-H.: Continuation Newton methods with the residual trust-region time-stepping scheme for nonlinear equations, Numer. Algorithms 89, 223–247 (2022). http://doi.org/10.1007/s11075-021-01112-x
Luo, X.-L., Yao, Y.Y.: Primal-dual path-following methods and the trust-region strategy for linear programming with noisy data, J. Comput. Math. 40, 760–780 (2022). http://doi.org/10.4208/jcm.2101-m2020-0173
Luo, X.-L., Xiao, H., Lv, J.-H., Zhang, S.: Explicit pseudo-transient continuation and the trust-region updating strategy for unconstrained optimization. Appl. Numer. Math. 165, 290–302 (2021). http://doi.org/10.1016/j.apnum.2021.02.019
Luo, X.-L., Xiao, H.: Generalized continuation Newton methods and the trust-region updating strategy for the underdetermined system, J. Sci. Comput. 88, article 56, 1–22 (2021). http://doi.org/10.1007/s10915-021-01566-0
Luo, X.-L., Xiao, H.: The regularization continuation method with an adaptive time step control for linearly constrained optimization problems, Appl. Numer. Math. 181, 255–276 (2022). https://doi.org/10.1016/j.apnum.2022.06.008
Luo, X.-L., Zhang, S., Xiao, H.: Regularization path-following methods with the trust-region updating strategy for linear complementarity problems. arXiv preprint available at http://arxiv.org/abs/2205.10727, pp. 1–30, May 21, (2022)
Luo, X.-L., Xiao, H., Zhang, S.: The regularization continuation method for optimization problems with nonlinear equality constraints. arXiv preprint available at http://arxiv.org/abs/2303.14692, pp. 1–41, March 28, (2023)
Man, K.F., Tang, K.S., Kwong, S.: Genetic algorithms: concepts and designs. Springer, Berlin (1999)
Macêdo, M.J.F.G., Karas, E.W., Costa, M.F.P., Rocha, A.M.A.C.: Filter-based stochastic algorithm for global optimization. J. Glob. Optim. 77, 777–805 (2020)
MATLAB R2021b.: The MathWorks Inc., http://www.mathworks.com, (2021)
MCS: the multilevel coordinate search, available at https://www.mat.univie.ac.at/~neum/software/mcs/, (2000)
Mitchell, M.: An introduction to genetic algorithms. MIT press, Cambridge, MA (1996)
Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Soft. 7, 17–41 (1981)
Moscato, P.: On evolution, search, optimization, gas and martial arts: toward memetic algorithms, Technical report, Caltech Concurrent Computation Program 158–79. California Institute of Technology, Pasadena, California (1989)
Morgans, R.C., Howard, C.Q., Zander, A.C., Hansen, C.H., Murphy, D.J.: Derivative free optimisation in engineering and acoustics, 14th International Congress on Sound & Vibration, 1–8 (2007)
Neidinger, R.D.: Introduction to automatic differentiation and MATLAB object-oriented programming. SIAM Rev. 52, 545–563 (2010). https://doi.org/10.1137/080743627
Neumaier, A.: MCS: global optimization by multilevel coordinate search. (2000)https://www.mat.univie.ac.at/~neum/software/mcs/
NEOS Server: (2021). https://neos-server.org/neos/
Nocedal, J., Wright, S.J.: Numerical optimization. Springer-Verlag, Berlin (1999)
Ortega, J.M., Rheinboldt, W.C.: Iterative solution of nonlinear equations in several variables. SIAM, Philadelphia, PA (2000)
Regis, R.G., Shoemaker, C.A.: A quasi-multistart framework for global optimization of expensive functions using response surface models. J. Glob. Optim. 56, 1719–1753 (2013)
Rios, L.M., Sahinidis, N.V.: Derivative-free optimization: a review of algorithms and comparison of software implementations. J. Glob. Optim. 56, 1247–1293 (2013)
Rosenbrock, H.H.: An automatic method for finding the greatest or least value of a function, Comput. J. 3, 175–184 (1960). Available online at http://comjnl.oxfordjournals.org/content/3/3/175.full.pdf
Shampine, L.F., Gladwell, I., Thompson, S.: Solving ODEs with MATLAB. Cambridge University Press, Cambridge (2003)
Sahinidis, N. V.: BARON 21.1.13: Global optimization of mixed-integer nonlinear programs, user’s manual (2021). Available at https://minlp.com/downloads/docs/baron manual.pdf
Surjanovic, S., Bingham, D.: Virtual library of simulation experiments: test functions and datasets, available at http://www.sfu.ca/~ssurjano, January (2020)
Sun, J., Garibaldi, J.M., Krasnogor, N., Zhang, Q.: An intelligent multi-restart memetic algorithm for box constrained global optimisation. Evol. Comput. 21, 107–147 (2014)
Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 8, 1–9 (2018)
Sergeyev, Y.D., Kvasov, D.E.: Deterministic global optimization: an introduction to the diagonal approach, Springer, (2017)
Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms. Math. Comput. Simul. 141, 96–109 (2017)
Sergeyev, Y.D., Kvasov, D.E.: A deterministic global optimization using smooth diagonal auxiliary functions. Commun. Nonlinear Sci. 21, 99–111 (2015)
Sun, W.-Y., Yuan, Y.-X.: Optimization theory and methods: nonlinear programming. Springer, Berlin (2006)
Tanabe, K.: Continuous Newton-Raphson method for solving an underdetermined system of nonlinear equations. Nonlinear Anal. 3, 495–503 (1979)
Tawarmalani, M., Sahinidis, N.V.: A polyhedral branch-and-cut approach to global optimization. Math. Program. 103, 225–249 (2005)
Teughels, A., De Roeck, G., Suykens, J.A.K.: Global optimization by coupled local minimizers and its application to FE model updating. Comput. Struct. 81, 2337–2351 (2003)
Ugray, Z., Lasdon, L., Plummer, J., Glover, F., Kelly, J., Marti, R.: Scatter search and local NLP solvers: a multistart framework for global optimization. INFORMS J. Comput. 19, 328–340 (2007)
Willkomm, J., Vehreschild, A.: The ADiMat handbook, (2013). http://adimat.sc.informatik.tu-darmstadt.de/doc/
Willkomm, J., Bischof, C.H., Bücker, H.M.: A new user interface for ADiMat: toward accurate and efficient derivatives of MATLAB programmes with ease of use. Int. J. Comput. Sci. Eng. 9, 408–415 (2014)
Xu, J., Nannariello, J., Fricke, F.R.: Optimising flat-walled multi-layered anechoic linings using evolutionary algorithms. Appl. Acoust. 65, 1009–1026 (2004)
Yuan, Y.-X.: Trust region algorithms for nonlinear equations. Information 1, 7–20 (1998)
Yuan, Y.-X.: Recent advances in trust region algorithms. Math. Program. 151, 249–281 (2015)
Žilinskas, A., Gillard, J., Scammell, M., Zhiglijavsky, A.: Multistart with early termination of descents. J. Glob. Optim. 79, 447–462 (2021)
Acknowledgements
The authors are grateful to Prof. Jonathan Eckstein for suggesting the comparison with Couenne and to Prof. Nick Sahinidis for suggesting the comparison with derivative-free optimization methods such as MCS and CMA-ES. The authors are also grateful to two anonymous referees for their comments and suggestions, which greatly improved the presentation of this paper.
Funding
This work was supported in part by Grants 61876199 and 62376036 from the National Natural Science Foundation of China, Grant YBWL2011085 from Huawei Technologies Co., Ltd., and Grant YJCB2011003HI from the Innovation Research Program of Huawei Technologies Co., Ltd.
Author information
Authors and Affiliations
Contributions
Xin-long Luo and Hang Xiao wrote the main manuscript text, and Sen Zhang performed all the numerical tests and prepared Figs. 1–4. All authors reviewed the manuscript.
Corresponding author
Ethics declarations
Ethical approval
Not applicable
Conflict of interest
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A Tables of Numerical Results
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Luo, X.-l., Xiao, H. & Zhang, S. Continuation Newton methods with deflation techniques for global optimization problems. Numer Algor (2024). https://doi.org/10.1007/s11075-024-01768-1
Keywords
- Continuation Newton method
- Deflation technique
- Derivative-free method
- Automatic differentiation
- Global optimization
- Genetic evolution