Search Results
-
Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs
The local minimax method (LMM) proposed by Li and Zhou (2001) and Li and Zhou (2002) is an efficient method to solve nonlinear elliptic partial...
-
A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems
The quasi-Newton method is one of the most effective methods that use only first-derivative information for solving unconstrained optimization problems. The...
-
Global Convergence of a Modified BFGS Method with Simultaneous Correction of Direction and Step Size
The standard BFGS method is a famous quasi-Newton method for solving optimization problems. For convex functions, the study of the convergence of...
-
An extended version of the memoryless DFP algorithm with the sufficient descent property
The classic memoryless DFP (Davidon–Fletcher–Powell) search direction is extended by embedding an additional term, as an uncomplicated scheme to...
-
A modified HZ conjugate gradient algorithm without gradient Lipschitz continuous condition for non convex functions
As we know, conjugate gradient methods are widely used for unconstrained optimization because of the advantages of simple structure and small...
-
The Projection Technique for Two Open Problems of Unconstrained Optimization Problems
There are two problems for nonconvex functions under the weak Wolfe–Powell line search in unconstrained optimization problems. The first one is the...
-
Quasi-Newton Methods
In this chapter we give a short introduction to Quasi-Newton methods (also known as variable metric methods or secant methods), which constitute an...
-
A conjugate gradient algorithm without Lipchitz continuity and its applications
An improved conjugate gradient algorithm is proposed that does not rely on the line search rule and automatically achieves sufficient descent and...
-
A J-symmetric quasi-Newton method for minimax problems
Minimax problems have gained tremendous attention across the optimization and machine learning communities recently. In this paper, we introduce a new...
-
Global convergence of a nonmonotone Broyden family method for nonconvex unconstrained minimization
In this paper, a nonmonotone Broyden family method is presented for unconstrained optimization problems. The proposed line search technique is...
-
A globally convergent improved BFGS method for generalized Nash equilibrium problems
In this article, we consider a class of Generalized Nash Equilibrium Problems (GNEPs) and solve it using one of the most effective quasi-Newton...
-
Conjugate Direction Methods
We consider conjugate direction methods, a class of algorithms originally introduced as iterative methods for solving...
-
Optimization Algorithms: An Overview
Optimization algorithms are a vast research area in their own right, with multiple strands. In this chapter we do not attempt anything close to a...
-
Quasi-Newton Methods
The idea of these methods is not to use the Hessian ∇²f(xₖ) of the objective function at the current point at every iteration, but instead to use an...
-
A new descent spectral Polak–Ribière–Polyak method based on the memoryless BFGS update
Spectral conjugate gradient methods are considered as an efficient family of conjugate gradient methods to solve unconstrained optimization problems....
-
Fundamentals on Unconstrained Optimization. Stepsize Computation
Unconstrained optimization consists of minimizing a function which depends on a number of real variables without any restrictions on their values...
-
A modified secant equation quasi-Newton method for unconstrained optimization
One of the most prominent classes of iterative approaches for solving unconstrained optimization problems is the quasi-Newton methods. Their fast convergence and...
-
A short note on an adaptive damped Newton method for strongly monotone and Lipschitz continuous operator equations
We consider the damped Newton method for strongly monotone and Lipschitz continuous operator equations in a variational setting. We provide a very...
-
Optimization
Optimization is the task of making the most of what you have. Mathematically, this is turned into finding either the maximum or minimum of a function...
-
Numerical Methods of Optimization
The numerical methods of optimization start with optimizing functions of one variable: bisection, Fibonacci, and Newton. Then, functions of several...
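Several of the results above concern quasi-Newton (BFGS/Broyden family) methods combined with the weak Wolfe-Powell line search. As a rough illustration of how these pieces fit together — not the algorithm of any listed paper; the function names, tolerances, and the bisection-style search are my own assumptions — a textbook BFGS iteration with a weak Wolfe-Powell step-size rule might be sketched like this:

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Hypothetical bisection/expansion search for a step t satisfying the
    weak Wolfe-Powell conditions:
      f(x + t d) <= f(x) + c1 * t * g'd   (sufficient decrease)
      grad(x + t d)'d >= c2 * g'd          (curvature)"""
    lo, hi, t = 0.0, np.inf, 1.0
    fx, dg0 = f(x), grad(x) @ d  # d must be a descent direction: dg0 < 0
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * dg0:   # decrease fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * dg0:   # curvature fails: grow
            lo = t
            t = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t  # fall back to the last trial step

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Standard BFGS iteration on the inverse Hessian approximation H."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                      # quasi-Newton direction
        t = weak_wolfe(f, grad, x, d)
        s = t * d
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y                      # > 0 under the curvature condition
        if sy > 1e-12:                  # safeguard before updating H
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x = x_new
    return x
```

The curvature condition is what makes `s @ y > 0` (and hence positive definiteness of `H`) hold automatically for the exact BFGS update; several of the listed papers modify exactly this interplay to obtain global convergence for nonconvex objectives, where the plain update can fail.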