Search Results
-
Two Improved Nonlinear Conjugate Gradient Methods with the Strong Wolfe Line Search
Two improved nonlinear conjugate gradient methods are proposed by using the second inequality of the strong Wolfe line search. Under usual...
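For context, the strong Wolfe line search referenced here combines the sufficient-decrease (Armijo) condition with a bound on the magnitude of the directional derivative at the trial step. A minimal illustrative check (not tied to any listed paper; the quadratic test function is an assumption for the example):

```python
import numpy as np

def strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    """Check whether step size alpha satisfies the strong Wolfe
    conditions for objective f along descent direction d at x."""
    g0 = grad(x) @ d  # directional derivative at alpha = 0 (negative for descent)
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * g0        # sufficient decrease
    curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0)   # strong curvature bound
    return bool(armijo and curvature)

# Illustrative quadratic: f(x) = ||x||^2, steepest-descent direction
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([1.0, 1.0])
d = -grad(x)
print(strong_wolfe(f, grad, x, d, alpha=0.25))  # a moderate step satisfies both conditions
```

An overly long step (e.g. `alpha=1.0` here, which overshoots the minimizer) fails the sufficient-decrease test, which is exactly what the second (curvature) inequality and the Armijo inequality are designed to screen for.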
-
An improvement of the Goldstein line search
This paper introduces CLS, a new line search along an arbitrary smooth search path that starts at the current iterate tangentially to a descent...
-
Line Search Methods
In this chapter we describe some of the best known line search algorithms employed in the unconstrained minimization of smooth functions. We will...
-
A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization
We propose a BFGS method with Wolfe line searches for unconstrained multiobjective optimization problems. The algorithm is well defined even for...
-
Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs
The local minimax method (LMM) proposed by Li and Zhou (2001) and Li and Zhou (2002) is an efficient method to solve nonlinear elliptic partial...
-
The Frank-Wolfe Algorithm: A Short Introduction
In this paper we provide an introduction to the Frank-Wolfe algorithm, a method for smooth convex optimization in the presence of (relatively)...
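The basic iteration underlying the Frank-Wolfe papers in these results replaces projection with a linear minimization oracle and moves by convex combination. A minimal sketch on the probability simplex, with the classic open-loop step size 2/(t+2) (illustrative only; the quadratic objective and simplex feasible set are assumptions for the example, not taken from any listed paper):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=100):
    """Minimal Frank-Wolfe loop on the probability simplex.
    The linear oracle min over the simplex of <grad, s> is attained
    at a vertex: the basis vector of the smallest gradient entry."""
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0             # linear minimization oracle (vertex)
        gamma = 2.0 / (t + 2.0)           # classic open-loop step size
        x = (1 - gamma) * x + gamma * s   # convex combination: stays feasible
    return x

# Example: minimize ||x - c||^2 over the simplex, with c an interior point
c = np.array([0.2, 0.3, 0.5])
grad = lambda x: 2 * (x - c)
x = frank_wolfe_simplex(grad, np.array([1.0, 0.0, 0.0]), n_iters=500)
```

Because every iterate is a convex combination of simplex vertices, feasibility holds by construction and no projection step is ever needed, which is the appeal the snippets above refer to as "projection-free".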
-
Linear convergence of Frank–Wolfe for rank-one matrix recovery without strong convexity
We consider convex optimization problems which are widely used as convex relaxations for low-rank matrix recovery problems. In particular, in several...
-
An away-step Frank–Wolfe algorithm for constrained multiobjective optimization
In this paper, we propose and analyze an away-step Frank–Wolfe algorithm designed for solving multiobjective optimization problems over polytopes. We...
-
Avoiding bad steps in Frank-Wolfe variants
The study of Frank-Wolfe (FW) variants is often complicated by the presence of different kinds of “good” and “bad” steps. In this article, we aim to...
-
A data driven Dantzig–Wolfe decomposition framework
We face the issue of finding alternative paradigms for the resolution of generic Mixed Integer Programs (MIP), by considering the perspective option...
-
Frank–Wolfe-type methods for a class of nonconvex inequality-constrained problems
The Frank–Wolfe (FW) method, which implements efficient linear oracles that minimize linear approximations of the objective function over a fixed...
-
Linewalker: line search for black box derivative-free optimization and surrogate model construction
This paper describes a simple, but effective sampling method for optimizing and learning a discrete approximation (or surrogate) of a...
-
Restarting Frank–Wolfe: Faster Rates under Hölderian Error Bounds
Conditional gradient algorithms (aka Frank–Wolfe algorithms) form a classical set of methods for constrained smooth convex minimization due to their...
-
Riemannian Optimization via Frank-Wolfe Methods
We study projection-free methods for constrained Riemannian optimization. In particular, we propose a Riemannian Frank-Wolfe (RFW) method that...
-
Generalized self-concordant analysis of Frank–Wolfe algorithms
Projection-free optimization via different variants of the Frank–Wolfe method has become one of the cornerstones of large scale optimization for...
-
A generalized Frank–Wolfe method with “dual averaging” for strongly convex composite optimization
We propose a simple variant of the generalized Frank–Wolfe method for solving strongly convex composite optimization problems, by introducing an...
-
An extended version of the memoryless DFP algorithm with the sufficient descent property
The classic memoryless DFP (Davidon–Fletcher–Powell) search direction is extended by embedding an additional term, as an uncomplicated scheme to...
-
A modified Fletcher-Reeves conjugate gradient method for unconstrained optimization with applications in image restoration
The Fletcher-Reeves (FR) method is widely recognized for its drawbacks, such as generating unfavorable directions and taking small steps, which can...
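The classical Fletcher-Reeves update that these modifications start from sets the conjugacy coefficient to the ratio of successive squared gradient norms. A minimal sketch on a quadratic, where the exact line search step has a closed form (illustrative only; the test matrix is an assumption for the example):

```python
import numpy as np

def fletcher_reeves(A, b, x0, n_iters=10):
    """Fletcher-Reeves CG on the quadratic f(x) = 0.5 x^T A x - b^T x,
    using the exact line search valid for quadratics."""
    x = x0.copy()
    g = A @ x - b          # gradient of the quadratic
    d = -g
    for _ in range(n_iters):
        if np.linalg.norm(g) < 1e-12:      # converged; avoid 0/0 below
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # new conjugate direction
        g = g_new
    return x

# 2x2 SPD system: exact-line-search CG converges in at most 2 iterations
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = fletcher_reeves(A, b, np.zeros(2), n_iters=2)
```

On quadratics with exact line search, FR coincides with linear CG; the drawbacks the snippet mentions (unfavorable directions, small steps) arise for general nonlinear objectives with inexact line searches.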
-
Two sufficient descent spectral conjugate gradient algorithms for unconstrained optimization with application
This study introduces a new modification of the conjugate gradient (CG) method (IMRMIL). Additionally, two spectral CG algorithms (SCG1 and SCG2) are...