Search Results

Showing 1-20 of 698 results
  1. Two Improved Nonlinear Conjugate Gradient Methods with the Strong Wolfe Line Search

    Two improved nonlinear conjugate gradient methods are proposed by using the second inequality of the strong Wolfe line search. Under usual...

    Jinbao Jian, Pengjie Liu, ... Bo He in Bulletin of the Iranian Mathematical Society
    Article 15 October 2021
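    Background (standard textbook form, not quoted from this paper): the strong Wolfe conditions require a step size $\alpha_k$ along a descent direction $d_k$ to satisfy
    $$f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^T d_k, \qquad |\nabla f(x_k + \alpha_k d_k)^T d_k| \le c_2\, |\nabla f(x_k)^T d_k|, \qquad 0 < c_1 < c_2 < 1;$$
    the "second inequality" mentioned in the snippet is the curvature condition on the right.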
  2. An improvement of the Goldstein line search

    This paper introduces CLS, a new line search along an arbitrary smooth search path, that starts at the current iterate tangentially to a descent...

    Arnold Neumaier, Morteza Kimiaei in Optimization Letters
    Article 05 April 2024
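    Background (classical form, not quoted from this paper): the Goldstein conditions that CLS aims to improve upon bracket the step size $\alpha_k$ along a descent direction $d_k$ via
    $$f(x_k) + (1 - c)\,\alpha_k \nabla f(x_k)^T d_k \;\le\; f(x_k + \alpha_k d_k) \;\le\; f(x_k) + c\,\alpha_k \nabla f(x_k)^T d_k, \qquad 0 < c < \tfrac{1}{2},$$
    where the upper bound enforces sufficient decrease and the lower bound rules out overly short steps.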
  3. Line Search Methods

    In this chapter we describe some of the best known line search algorithms employed in the unconstrained minimization of smooth functions. We will...
    Luigi Grippo, Marco Sciandrone in Introduction to Methods for Nonlinear Optimization
    Chapter 2023
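    As a concrete instance of the kind of method such a chapter typically covers (a generic sketch, not taken from this book): backtracking with the Armijo condition starts from a trial step $\alpha = \bar{\alpha} > 0$ and repeatedly sets $\alpha \leftarrow \tau \alpha$ (e.g. $\tau = \tfrac{1}{2}$) until
    $$f(x_k + \alpha d_k) \le f(x_k) + c_1 \alpha \nabla f(x_k)^T d_k, \qquad c_1 \in (0,1),$$
    after which the update $x_{k+1} = x_k + \alpha d_k$ is taken.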
  4. A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization

    We propose a BFGS method with Wolfe line searches for unconstrained multiobjective optimization problems. The algorithm is well defined even for...

    L. F. Prudente, D. R. Souza in Journal of Optimization Theory and Applications
    Article 20 July 2022
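    For context (standard single-objective form, not the multiobjective variant proposed in this paper): with $s_k = x_{k+1} - x_k$, $y_k = \nabla f(x_{k+1}) - \nabla f(x_k)$ and $\rho_k = 1/(y_k^T s_k)$, BFGS updates the inverse Hessian approximation by
    $$H_{k+1} = (I - \rho_k s_k y_k^T)\, H_k\, (I - \rho_k y_k s_k^T) + \rho_k s_k s_k^T,$$
    and a Wolfe line search guarantees $y_k^T s_k > 0$, which keeps $H_{k+1}$ positive definite.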
  5. Normalized Wolfe-Powell-type local minimax method for finding multiple unstable solutions of nonlinear elliptic PDEs

    The local minimax method (LMM) proposed by Li and Zhou (2001) and Li and Zhou (2002) is an efficient method to solve nonlinear elliptic partial...

    Wei Liu, Ziqing Xie, Wenfan Yi in Science China Mathematics
    Article 21 April 2023
  6. The Frank-Wolfe Algorithm: A Short Introduction

    In this paper we provide an introduction to the Frank-Wolfe algorithm, a method for smooth convex optimization in the presence of (relatively)...

    Article Open access 13 December 2023
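    For context (the classical form of the method, stated from standard references rather than this paper): to minimize a smooth convex $f$ over a compact convex set $\mathcal{C}$, each Frank-Wolfe iteration calls a linear minimization oracle and takes a convex-combination step,
    $$s_k \in \arg\min_{s \in \mathcal{C}} \langle \nabla f(x_k), s \rangle, \qquad x_{k+1} = x_k + \gamma_k (s_k - x_k), \qquad \gamma_k \in [0,1]$$
    (e.g. $\gamma_k = \tfrac{2}{k+2}$), so the iterates remain feasible without any projection step.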
  7. Linear convergence of Frank–Wolfe for rank-one matrix recovery without strong convexity

    We consider convex optimization problems which are widely used as convex relaxations for low-rank matrix recovery problems. In particular, in several...

    Dan Garber in Mathematical Programming
    Article 12 May 2022
  8. An away-step Frank–Wolfe algorithm for constrained multiobjective optimization

    In this paper, we propose and analyze an away-step Frank–Wolfe algorithm designed for solving multiobjective optimization problems over polytopes. We...

    Douglas S. Gonçalves, Max L. N. Gonçalves, Jefferson G. Melo in Computational Optimization and Applications
    Article 07 May 2024
  9. Avoiding bad steps in Frank-Wolfe variants

    The study of Frank-Wolfe (FW) variants is often complicated by the presence of different kinds of “good” and “bad” steps. In this article, we aim to...

    Francesco Rinaldi, Damiano Zeffiro in Computational Optimization and Applications
    Article 27 November 2022
  10. A data driven Dantzig–Wolfe decomposition framework

    We face the issue of finding alternative paradigms for the resolution of generic Mixed Integer Programs (MIP), by considering the perspective option...

    Saverio Basso, Alberto Ceselli in Mathematical Programming Computation
    Article Open access 02 November 2022
  11. Frank–Wolfe-type methods for a class of nonconvex inequality-constrained problems

    The Frank–Wolfe (FW) method, which implements efficient linear oracles that minimize linear approximations of the objective function over a fixed...

    Liaoyuan Zeng, Yongle Zhang, ... Xiaozhou Wang in Mathematical Programming
    Article 03 February 2024
  12. Linewalker: line search for black box derivative-free optimization and surrogate model construction

    This paper describes a simple, but effective sampling method for optimizing and learning a discrete approximation (or surrogate) of a...

    Dimitri J. Papageorgiou, Jan Kronqvist, Krishnan Kumaran in Optimization and Engineering
    Article 21 February 2024
  13. Restarting Frank–Wolfe: Faster Rates under Hölderian Error Bounds

    Conditional gradient algorithms (aka Frank–Wolfe algorithms) form a classical set of methods for constrained smooth convex minimization due to their...

    Thomas Kerdreux, Alexandre d’Aspremont, Sebastian Pokutta in Journal of Optimization Theory and Applications
    Article 30 January 2022
  14. Riemannian Optimization via Frank-Wolfe Methods

    We study projection-free methods for constrained Riemannian optimization. In particular, we propose a Riemannian Frank-Wolfe ( RFW ) method that...

    Melanie Weber, Suvrit Sra in Mathematical Programming
    Article Open access 14 July 2022
  15. Generalized self-concordant analysis of Frank–Wolfe algorithms

    Projection-free optimization via different variants of the Frank–Wolfe method has become one of the cornerstones of large scale optimization for...

    Pavel Dvurechensky, Kamil Safin, ... Mathias Staudigl in Mathematical Programming
    Article Open access 29 January 2022
  16. A generalized Frank–Wolfe method with “dual averaging” for strongly convex composite optimization

    We propose a simple variant of the generalized Frank–Wolfe method for solving strongly convex composite optimization problems, by introducing an...

    Renbo Zhao, Qiuyun Zhu in Optimization Letters
    Article Open access 07 November 2022
  17. An extended version of the memoryless DFP algorithm with the sufficient descent property

    The classic memoryless DFP (Davidon–Fletcher–Powell) search direction is extended by embedding an additional term, as an uncomplicated scheme to...

    Arezoo Bakhshinejad, Saman Babaie-Kafaki in Bollettino dell'Unione Matematica Italiana
    Article 25 March 2024
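    For context (the classical construction, not the extended scheme of this paper): the memoryless DFP direction applies the DFP inverse-Hessian update to the identity matrix, giving
    $$d_{k+1} = -g_{k+1} + \frac{y_k^T g_{k+1}}{y_k^T y_k}\, y_k - \frac{s_k^T g_{k+1}}{s_k^T y_k}\, s_k,$$
    with $g_{k+1} = \nabla f(x_{k+1})$, $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$.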
  18. A modified Fletcher-Reeves conjugate gradient method for unconstrained optimization with applications in image restoration

    The Fletcher-Reeves (FR) method is widely recognized for its drawbacks, such as generating unfavorable directions and taking small steps, which can...

    Zainab Hassan Ahmed, Mohamed Hbaib, Khalil K. Abbo in Applications of Mathematics
    Article 07 June 2024
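    For reference (the standard definition, not this paper's modification): the FR method sets $d_0 = -g_0$ and generates
    $$d_{k+1} = -g_{k+1} + \beta_k^{FR} d_k, \qquad \beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad g_k = \nabla f(x_k).$$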
  19. Two sufficient descent spectral conjugate gradient algorithms for unconstrained optimization with application

    This study introduces a new modification of the conjugate gradient (CG) method (IMRMIL). Additionally, two spectral CG algorithms (SCG1 and SCG2) are...

    Sulaiman Mohammed Ibrahim, Nasiru Salihu in Optimization and Engineering
    Article 13 June 2024