Search Results

Showing 21-40 of 7,115 results
  1. Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function

    The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of...

    David Ek, Anders Forsgren in Computational Optimization and Applications
    Article Open access 28 April 2021
  2. Rates of superlinear convergence for classical quasi-Newton methods

    We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that...

    Anton Rodomanov, Yurii Nesterov in Mathematical Programming
    Article Open access 08 February 2021
  3. New Results on Superlinear Convergence of Classical Quasi-Newton Methods

    We present a new theoretical analysis of local superlinear convergence of classical quasi-Newton methods from the convex Broyden class. As a result,...

    Anton Rodomanov, Yurii Nesterov in Journal of Optimization Theory and Applications
    Article Open access 09 January 2021
  4. Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions

    This study considers a proximal Newton-type method to solve the minimization of a composite function that is the sum of a smooth nonconvex function...

    Shummin Nakayama, Yasushi Narushima, Hiroshi Yabe in Computational Optimization and Applications
    Article 06 February 2021
  5. A new semismooth Newton method for solving finite-dimensional quasi-variational inequalities

    In this paper, we consider the numerical method for solving finite-dimensional quasi-variational inequalities with both equality and inequality...

    Shui-Lian Xie, Zhe Sun, Hong-Ru Xu in Journal of Inequalities and Applications
    Article Open access 28 July 2021
  6. Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition

    In this paper, we introduce a new variant of the BFGS method designed to perform well when gradient measurements are corrupted by noise. We show that...

    Brian Irwin, Eldad Haber in Computational Optimization and Applications
    Article 09 January 2023
  7. Methods for Nonlinear Equations

    In this chapter we consider solution methods for nonlinear equations, such as Newton-type methods, quasi-Newton methods, and fixed-point methods,...
    Luigi Grippo, Marco Sciandrone in Introduction to Methods for Nonlinear Optimization
    Chapter 2023
  8. A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization

    In this paper, a novel stochastic extra-step quasi-Newton method is developed to solve a class of nonsmooth nonconvex composite optimization...

    Minghan Yang, Andre Milzarek, ... Tong Zhang in Mathematical Programming
    Article 13 March 2021
  9. Methods for Large-Scale Optimization

    In this chapter we present methods for solving large scale nonlinear equations and nonlinear unconstrained optimization problems. In particular, we...
    Luigi Grippo, Marco Sciandrone in Introduction to Methods for Nonlinear Optimization
    Chapter 2023
  10. Greedy PSB methods with explicit superlinear convergence

    Recently, Rodomanov and Nesterov proposed a class of greedy quasi-Newton methods and established the first explicit local superlinear convergence...

    Zhen-Yuan Ji, Yu-Hong Dai in Computational Optimization and Applications
    Article 07 June 2023
  11. A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization

    We propose a BFGS method with Wolfe line searches for unconstrained multiobjective optimization problems. The algorithm is well defined even for...

    L. F. Prudente, D. R. Souza in Journal of Optimization Theory and Applications
    Article 20 July 2022
  12. Continuous Projection Generalized Extra-Gradient Quasi-Newton Second-Order Method for Solving Saddle Point Problems

    The paper presents a study of a method for solving saddle point problems for convex-concave smooth functions with Lipschitz partial gradients...

    Article 01 May 2022
  13. A hybrid semismooth quasi-Newton method for nonsmooth optimal control with PDEs

    We propose a semismooth Newton-type method for nonsmooth optimal control problems. Its particular feature is the combination of a quasi-Newton method...

    Florian Mannel, Armin Rund in Optimization and Engineering
    Article Open access 18 July 2020
  14. A smoothing quasi-Newton method for solving general second-order cone complementarity problems

    Recently, there has been much interest in studying smoothing Newton methods for solving the monotone second-order cone complementarity problem (SOCCP) or...

    Jingyong Tang, Jinchuan Zhou in Journal of Global Optimization
    Article 24 November 2020
  15. SQP Methods

    In this chapter we describe the essential features of the techniques known as Sequential Quadratic Programming (SQP) methods, which can be viewed as...
    Luigi Grippo, Marco Sciandrone in Introduction to Methods for Nonlinear Optimization
    Chapter 2023
  16. Large-scale quasi-Newton trust-region methods with low-dimensional linear equality constraints

    We propose two limited-memory BFGS (L-BFGS) trust-region methods for large-scale optimization with linear equality constraints. The methods are...

    Johannes J. Brust, Roummel F. Marcia, Cosmin G. Petra in Computational Optimization and Applications
    Article 05 September 2019
  17. A dense initialization for limited-memory quasi-Newton methods

    We consider a family of dense initializations for limited-memory quasi-Newton methods. The proposed initialization exploits an...

    Johannes Brust, Oleg Burdakov, ... Roummel F. Marcia in Computational Optimization and Applications
    Article 29 May 2019
  18. Numerical Methods of Optimization

    The numerical methods of optimization start with optimizing functions of one variable, bisection, Fibonacci, and Newton. Then, functions of several...
    Jean-Pierre Corriou in Numerical Methods and Optimization
    Chapter 2021
  19. Quasi-Newton Methods

    Quasi-Newton methods do not compute the Hessian of nonlinear functions. The Hessian is updated by analyzing successive...
    Shashi Kant Mishra, Bhagwat Ram in Introduction to Unconstrained Optimization with R
    Chapter 2019
  20. Inexact proximal DC Newton-type method for nonconvex composite functions

    We consider a class of difference-of-convex (DC) optimization problems where the objective function is the sum of a smooth function and a possibly...

    Shummin Nakayama, Yasushi Narushima, Hiroshi Yabe in Computational Optimization and Applications
    Article 15 September 2023