Derivative-Free Methods for Unconstrained Optimization

Chapter in: Introduction to Methods for Nonlinear Optimization

Part of the book series: UNITEXT, volume 152

Abstract

In this chapter we introduce some classes of optimization methods that do not use derivatives of the objective function. After a short introduction, we consider unconstrained minimization problems and we describe some of the best known derivative-free methods. Then we study a class of globally convergent methods based on the inexact derivative-free linesearch techniques already introduced in Chap. 10. Finally, we describe some techniques employing gradient approximations and we outline the use of model-based methods. Derivative-free methods for problems with box constraints will be presented in Chap. 20. Derivative-free nonmonotone methods will be introduced in Chap. 24.
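As a concrete illustration of the direct-search idea surveyed in this chapter, the sketch below implements a simple coordinate (compass) search with step halving: the objective is sampled along the positive and negative coordinate directions, a move is accepted on simple decrease, and the stepsize is halved when no direction improves. This is a generic sketch of the direct-search principle, not one of the specific methods analyzed in the chapter; the function names and the quadratic test problem are chosen only for the example.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimal derivative-free coordinate (compass) search.

    Samples f along the +/- coordinate directions, accepts any point
    with a lower function value (simple decrease), and halves the
    stepsize when no coordinate direction yields a decrease.
    A generic illustration, not a method from the chapter.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(x.size):
            for s in (+1.0, -1.0):
                y = x.copy()
                y[i] += s * step
                fy = f(y)
                if fy < fx:          # simple decrease: accept the move
                    x, fx = y, fy
                    improved = True
        if not improved:             # no direction improved: shrink the step
            step *= 0.5
        it += 1
    return x, fx

# Example on a separable convex quadratic with minimizer (1, -2)
quad = lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2
x_star, f_star = compass_search(quad, [0.0, 0.0])
```

Note that only simple decrease is required here; the globally convergent linesearch-based methods studied later impose a *sufficient* decrease condition instead, which is what rules out the stagnation phenomena known for naive direct searches.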

Notes

  1. Typically of the order \(\sqrt{\eta}\), where \(\eta\) is the machine precision.
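The choice of a finite-difference step of the order \(\sqrt{\eta}\) can be sketched as follows: for a forward difference, the truncation error grows like \(O(h)\) while the roundoff error grows like \(O(\eta/h)\), and balancing the two gives \(h \approx \sqrt{\eta}\). The code below is a generic sketch of this standard choice, with names and the test function invented for the example.

```python
import numpy as np

def fd_gradient(f, x, h=None):
    """Forward-difference gradient approximation.

    By default the step h is of the order sqrt(eta), eta being the
    machine precision, which balances the O(h) truncation error
    against the O(eta/h) roundoff error. A generic sketch, not code
    from the chapter.
    """
    x = np.asarray(x, dtype=float)
    if h is None:
        h = np.sqrt(np.finfo(float).eps)  # about 1.49e-8 for IEEE doubles
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        y = x.copy()
        y[i] += h                         # perturb one coordinate at a time
        g[i] = (f(y) - fx) / h
    return g

quad = lambda z: z[0] ** 2 + 3.0 * z[1] ** 2
g = fd_gradient(quad, [1.0, 2.0])         # exact gradient is (2, 12)
```

Each gradient estimate costs n + 1 function evaluations, which is why such approximations are attractive only when function evaluations are not too expensive.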


Copyright information

© 2023 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Grippo, L., Sciandrone, M. (2023). Derivative-Free Methods for Unconstrained Optimization. In: Introduction to Methods for Nonlinear Optimization. UNITEXT, vol 152. Springer, Cham. https://doi.org/10.1007/978-3-031-26790-1_19
