Search Results
-
Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function
The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of...
-
Rates of superlinear convergence for classical quasi-Newton methods
We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that...
-
New Results on Superlinear Convergence of Classical Quasi-Newton Methods
We present a new theoretical analysis of local superlinear convergence of classical quasi-Newton methods from the convex Broyden class. As a result,...
-
Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
This study considers a proximal Newton-type method to solve the minimization of a composite function that is the sum of a smooth nonconvex function...
-
A new semismooth Newton method for solving finite-dimensional quasi-variational inequalities
In this paper, we consider the numerical method for solving finite-dimensional quasi-variational inequalities with both equality and inequality...
-
Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition
In this paper, we introduce a new variant of the BFGS method designed to perform well when gradient measurements are corrupted by noise. We show that...
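Since this entry hinges on the secant condition, a minimal numerical illustration may help. This is the standard (unpenalized) direct BFGS Hessian update, not the paper's penalized variant, and the test values are made up for illustration:

```python
import numpy as np

def bfgs_hessian_update(B, s, y):
    """Standard BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    The result satisfies the secant condition B_new @ s == y.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

B = np.eye(2)                      # initial Hessian approximation
s = np.array([1.0, 0.5])           # step taken
y = np.array([2.0, 1.5])           # observed gradient change
B_new = bfgs_hessian_update(B, s, y)
# B_new maps the step to the gradient change: B_new @ s equals y
```

Noisy gradients corrupt y, so enforcing this condition exactly propagates the noise into B, which motivates penalizing rather than enforcing it.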
-
Methods for Nonlinear Equations
In this chapter we consider solution methods for nonlinear equations, such as Newton-type methods, quasi-Newton methods, and fixed point methods, ...
-
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
In this paper, a novel stochastic extra-step quasi-Newton method is developed to solve a class of nonsmooth nonconvex composite optimization...
-
Methods for Large-Scale Optimization
In this chapter we present methods for solving large-scale nonlinear equations and nonlinear unconstrained optimization problems. In particular, we ...
-
Greedy PSB methods with explicit superlinear convergence
Recently, Rodomanov and Nesterov proposed a class of greedy quasi-Newton methods and established the first explicit local superlinear convergence...
-
A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization
We propose a BFGS method with Wolfe line searches for unconstrained multiobjective optimization problems. The algorithm is well defined even for...
-
Continuous Projection Generalized Extra-Gradient Quasi-Newton Second-Order Method for Solving Saddle Point Problems
The paper presents a study of a method for solving saddle point problems for convex-concave smooth functions with Lipschitz partial gradients...
-
A hybrid semismooth quasi-Newton method for nonsmooth optimal control with PDEs
We propose a semismooth Newton-type method for nonsmooth optimal control problems. Its particular feature is the combination of a quasi-Newton method...
-
A smoothing quasi-Newton method for solving general second-order cone complementarity problems
Recently, there has been much interest in studying smoothing Newton methods for solving the monotone second-order cone complementarity problem (SOCCP) or...
-
SQP Methods
In this chapter we describe the essential features of the techniques known as Sequential Quadratic Programming (SQP) methods, which can be viewed as ...
-
Large-scale quasi-Newton trust-region methods with low-dimensional linear equality constraints
We propose two limited-memory BFGS (L-BFGS) trust-region methods for large-scale optimization with linear equality constraints. The methods are...
-
A dense initialization for limited-memory quasi-Newton methods
We consider a family of dense initializations for limited-memory quasi-Newton methods. The proposed initialization exploits an...
-
Numerical Methods of Optimization
The numerical methods of optimization start with optimizing functions of one variable: bisection, Fibonacci, and Newton. Then, functions of several ...
-
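As a minimal illustration of the one-variable methods this chapter snippet names (a sketch, not code from the chapter), bisection can minimize a smooth unimodal function by locating the sign change of its derivative:

```python
def bisect_minimize(df, lo, hi, tol=1e-10):
    """Minimize a smooth unimodal function on [lo, hi] by bisection on
    its derivative df, assumed negative at lo and positive at hi."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if df(mid) > 0:          # minimizer lies to the left of mid
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Example: f(x) = (x - 2)**2, so f'(x) = 2*(x - 2); the minimizer is x = 2.
x_star = bisect_minimize(lambda x: 2.0 * (x - 2.0), 0.0, 5.0)
```

Each iteration halves the bracketing interval, so the iteration count is logarithmic in the requested tolerance.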
Quasi-Newton Methods
Quasi-Newton methods do not compute the Hessian of nonlinear functions. The Hessian is updated by analyzing successive ...
-
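The snippet above captures the core idea: instead of forming the Hessian, quasi-Newton methods build an approximation from successive steps and gradient differences. A minimal BFGS sketch, run here on an assumed quadratic test problem (all values illustrative), looks like:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation H from the
    step s and gradient change y (requires curvature s @ y > 0)."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# Assumed test problem: minimize f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

x, H = np.zeros(2), np.eye(2)      # start from the identity approximation
for _ in range(30):
    x_new = x - H @ grad(x)        # quasi-Newton step (unit step length)
    s, y = x_new - x, grad(x_new) - grad(x)
    if s @ y > 1e-12:              # keep H positive definite
        H = bfgs_inverse_update(H, s, y)
    x = x_new
# x approaches the minimizer of f, i.e. the solution of A x = b
```

Only gradient evaluations are needed; the curvature information enters solely through the pairs (s, y).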
Inexact proximal DC Newton-type method for nonconvex composite functions
We consider a class of difference-of-convex (DC) optimization problems where the objective function is the sum of a smooth function and a possibly...