Search Results

Showing 1-20 of 264 results
  1. Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs

    The local minimax method (LMM) proposed by Li and Zhou (2001) and Li and Zhou (2002) is an efficient method to solve nonlinear elliptic partial...

Wei Liu, Ziqing Xie, Wenfan Yi in Science China Mathematics
    Article 21 April 2023
  2. A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems

The quasi-Newton method is one of the most effective first-derivative methods for solving unconstrained optimization problems. The...

    Gonglin Yuan, Zhan Wang, Pengyuan Li in Calcolo
    Article 19 October 2020
  3. Global Convergence of a Modified BFGS Method with Simultaneous Correction of Direction and Step Size

The standard BFGS method is a famous quasi-Newton method for solving optimization problems. For convex functions, the study of the convergence of... (A baseline BFGS iteration is sketched after this list.)

    Article 16 August 2023
  4. An extended version of the memoryless DFP algorithm with the sufficient descent property

    The classic memoryless DFP (Davidon–Fletcher–Powell) search direction is extended by embedding an additional term, as an uncomplicated scheme to...

    Arezoo Bakhshinejad, Saman Babaie–Kafaki in Bollettino dell'Unione Matematica Italiana
    Article 25 March 2024
  5. A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions

Conjugate gradient methods are widely used for unconstrained optimization because of their simple structure and small...

    Gonglin Yuan, Ailun Jian, ... Jiajia Yu in Journal of Applied Mathematics and Computing
    Article 21 March 2022
  6. The Projection Technique for Two Open Problems of Unconstrained Optimization Problems

Two open problems concern nonconvex functions under the weak Wolfe–Powell line search in unconstrained optimization. The first is the... (The weak Wolfe–Powell conditions themselves are sketched after this list.)

Gonglin Yuan, Xiaoliang Wang, Zhou Sheng in Journal of Optimization Theory and Applications
    Article 15 July 2020
  7. Quasi-Newton Methods

    In this chapter we give a short introduction to Quasi-Newton methods (also known as variable metric methods or secant methods), which constitute an...
    Luigi Grippo, Marco Sciandrone in Introduction to Methods for Nonlinear Optimization
    Chapter 2023
8. A conjugate gradient algorithm without Lipschitz continuity and its applications

    An improved conjugate gradient algorithm is proposed that does not rely on the line search rule and automatically achieves sufficient descent and...

    Huiyun Liu, Haishan Feng in Journal of Applied Mathematics and Computing
    Article 03 May 2024
9. A J-symmetric quasi-Newton method for minimax problems

Minimax problems have gained tremendous attention across the optimization and machine learning communities recently. In this paper, we introduce a new...

Azam Asl, Haihao Lu, Jinwen Yang in Mathematical Programming
    Article 20 April 2023
  10. Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization

In this paper, a nonmonotone Broyden family method is presented for unconstrained optimization problems. The proposed line search technique is...

    Gonglin Yuan, Zhan Wang, Pengyuan Li in Computational and Applied Mathematics
    Article 04 August 2022
  11. A globally convergent improved BFGS method for generalized Nash equilibrium problems

    In this article, we consider a class of Generalized Nash Equilibrium Problems (GNEPs) and solve it using one of the most effective quasi-Newton...

    Abhishek Singh, Debdas Ghosh in SeMA Journal
    Article 24 January 2023
  12. Conjugate Direction Methods

We consider conjugate direction methods, a class of algorithms originally introduced as iterative methods for solving...
    Luigi Grippo, Marco Sciandrone in Introduction to Methods for Nonlinear Optimization
    Chapter 2023
  13. Optimization Algorithms: An Overview

Optimization algorithms form a vast research area in their own right, with multiple strands. In this chapter we do not attempt anything close to a...
    Vivek S. Borkar, K. S. Mallikarjuna Rao in Elementary Convexity with Optimization
    Chapter 2023
  14. Quasi-Newton Methods

The idea of these methods is not to use the Hessian ∇²f(xₖ) of the function being minimized at the current point at every iteration, but instead to use an...
    Chapter 2022
  15. A new descent spectral Polak–Ribière–Polyak method based on the memoryless BFGS update

    Spectral conjugate gradient methods are considered as an efficient family of conjugate gradient methods to solve unconstrained optimization problems....

    Maryam Khoshsimaye-Bargard, Ali Ashrafi in Computational and Applied Mathematics
    Article 19 October 2021
  16. Fundamentals on Unconstrained Optimization. Stepsize Computation

    Unconstrained optimization consists of minimizing a function which depends on a number of real variables without any restrictions on their values....
    Chapter 2022
  17. A modified secant equation quasi-Newton method for unconstrained optimization

Among the most prominent iterative approaches for solving unconstrained optimization problems are quasi-Newton methods. Their fast convergence and...

    Basim A. Hassan, Issam A. R. Moghrabi in Journal of Applied Mathematics and Computing
    Article 31 May 2022
  18. A short note on an adaptive damped Newton method for strongly monotone and Lipschitz continuous operator equations

    We consider the damped Newton method for strongly monotone and Lipschitz continuous operator equations in a variational setting. We provide a very...

    Pascal Heid in Archiv der Mathematik
    Article Open access 07 May 2023
  19. Optimization

Optimization is the task of making the most of what you have. Mathematically, this becomes finding either the maximum or minimum of a function...
    Chapter 2022
  20. Numerical Methods of Optimization

The numerical methods of optimization start with optimizing functions of one variable, using bisection, Fibonacci, and Newton methods. Then, functions of several...
    Jean-Pierre Corriou in Numerical Methods and Optimization
    Chapter 2021
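
Several of the results above (items 1, 2, 6, and 10) center on the weak Wolfe–Powell (WWP) line search. For orientation, the condition pair is reproduced below in its commonly cited form; parameter conventions vary across the listed papers, so treat this as a general sketch rather than the exact statement used in any one of them.

```latex
% Weak Wolfe–Powell (WWP) line search: given a descent direction d_k,
% choose a step length \alpha_k > 0 satisfying both conditions.
\begin{aligned}
  &f(x_k + \alpha_k d_k) \le f(x_k)
      + \delta \, \alpha_k \, \nabla f(x_k)^{\top} d_k
      && \text{(sufficient decrease)} \\
  &\nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma \, \nabla f(x_k)^{\top} d_k
      && \text{(curvature)}
\end{aligned}
\qquad \text{with } 0 < \delta < \sigma < 1.
```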
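
Several other results (items 3, 7, 11, 14, and 17) start from the BFGS update, the most widely used quasi-Newton scheme. As a concrete baseline, here is a minimal NumPy sketch of BFGS driven by a bisection WWP line search; the function names, tolerances, and curvature guard are illustrative choices, not taken from any of the cited papers.

```python
# Illustrative sketch only: parameters and names are assumptions,
# not drawn from the papers listed above.
import numpy as np

def weak_wolfe_step(f, grad, x, d, delta=1e-4, sigma=0.9, max_bisect=50):
    """Bisection search for a step satisfying the weak Wolfe-Powell conditions."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, slope0 = f(x), grad(x) @ d           # slope0 < 0 for a descent direction
    for _ in range(max_bisect):
        if f(x + alpha * d) > f0 + delta * alpha * slope0:
            hi = alpha                       # sufficient decrease failed: shrink
        elif grad(x + alpha * d) @ d < sigma * slope0:
            lo = alpha                       # curvature failed: grow
        else:
            return alpha                     # both WWP conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    """Plain BFGS iteration on the inverse-Hessian approximation H."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                           # quasi-Newton search direction
        alpha = weak_wolfe_step(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                       # guard keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # standard BFGS update
        x, g = x_new, g_new
    return x

# Usage: the Rosenbrock function, minimizer at (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(bfgs(f, grad, [-1.2, 1.0]))            # approximately [1. 1.]
```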