Search Results
-
Variance reduced moving balls approximation method for smooth constrained minimization problems
In this paper, we consider the problem of minimizing the sum of a large number of smooth convex functions subject to a complicated constraint set...
-
Block coordinate descent for smooth nonconvex constrained minimization
At each iteration of a block coordinate descent method one minimizes an approximation of the objective function with respect to a generally small set...
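As a rough illustration of the generic idea (not this paper's nonconvex constrained method), block coordinate descent on a plain least-squares objective exactly minimizes over one coordinate block at a time while holding the others fixed; the matrix `A`, vector `b`, and two-block split below are illustrative choices:

```python
import numpy as np

# Generic block coordinate descent sketch on f(x) = ||Ax - b||^2:
# cycle over coordinate blocks, exactly minimizing over the active block
# with the remaining blocks frozen.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)

x = np.zeros(10)
blocks = [np.arange(0, 5), np.arange(5, 10)]  # two coordinate blocks

for _ in range(200):
    for blk in blocks:
        rest = np.setdiff1d(np.arange(10), blk)
        # residual with the other blocks held fixed, then an exact block solve
        r = b - A[:, rest] @ x[rest]
        x[blk] = np.linalg.lstsq(A[:, blk], r, rcond=None)[0]

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x - x_star))  # distance to the full least-squares solution
```

For a strongly convex quadratic like this, the two-block Gauss–Seidel sweep converges linearly to the full least-squares solution.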
-
On complexity and convergence of high-order coordinate descent algorithms for smooth nonconvex box-constrained minimization
Coordinate descent methods have considerable impact in global optimization because global (or, at least, almost global) minimization is affordable...
-
Domain Decomposition for Non-smooth (in Particular TV) Minimization
Domain decomposition is one of the most effective techniques for deriving efficient methods for large-scale problems. In this chapter such decomposition...
-
A Newton Frank–Wolfe method for constrained self-concordant minimization
We develop a new Newton Frank–Wolfe algorithm to solve a class of constrained self-concordant minimization problems using linear minimization oracles...
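The key primitive here is the linear minimization oracle (LMO). A vanilla Frank–Wolfe sketch (far simpler than the paper's Newton variant) shows how an LMO replaces projection; the quadratic objective and simplex constraint below are illustrative assumptions:

```python
import numpy as np

# Vanilla Frank-Wolfe sketch: minimize f(x) = ||x - y||^2 over the
# probability simplex using only a linear minimization oracle (LMO).
y = np.array([0.1, 0.5, 0.7, -0.2])

def grad(x):
    return 2 * (x - y)

def lmo(g):
    # LMO over the simplex: the vertex (basis vector) minimizing <g, s>
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x = np.full(4, 0.25)            # start at the simplex barycenter
for k in range(200):
    s = lmo(grad(x))            # oracle call: best vertex for the linearization
    gamma = 2.0 / (k + 2)       # standard open-loop step size
    x = (1 - gamma) * x + gamma * s

print(x)  # approximately the projection of y onto the simplex
```

Because each iterate is a convex combination of simplex vertices, feasibility is maintained for free, which is the appeal of projection-free methods.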
-
Majorization-minimization-based Levenberg–Marquardt method for constrained nonlinear least squares
A new Levenberg–Marquardt (LM) method for solving nonlinear least squares problems with convex constraints is described. Various versions of the LM...
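For orientation, a plain unconstrained Levenberg–Marquardt loop looks as follows; the toy residual system is an illustrative assumption, and the paper's majorization-minimization treatment of convex constraints is omitted:

```python
import numpy as np

# Generic (unconstrained) Levenberg-Marquardt sketch: solve min ||r(x)||^2
# with the damped Gauss-Newton step (J^T J + lam*I) dx = -J^T r.
def residual(x):
    # toy residuals: intersect the circle x0^2 + x1^2 = 4 with the line x0 = x1
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]])

def jacobian(x):
    return np.array([[2 * x[0], 2 * x[1]], [1.0, -1.0]])

x = np.array([3.0, 1.0])
lam = 1e-2
for _ in range(50):
    r, J = residual(x), jacobian(x)
    dx = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if np.linalg.norm(residual(x + dx)) < np.linalg.norm(r):
        x, lam = x + dx, lam * 0.5   # accept the step, trust the model more
    else:
        lam *= 10.0                  # reject the step, damp harder

print(x)  # converges to [sqrt(2), sqrt(2)] from this starting point
```

The damping parameter interpolates between Gauss–Newton (small `lam`) and a short gradient step (large `lam`).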
-
Iterative regularization for constrained minimization formulations of nonlinear inverse problems
In this paper we study the formulation of inverse problems as constrained minimization problems and their iterative solution by gradient or Newton...
-
Complementary composite minimization, small gradients in general norms, and applications
Composite minimization is a powerful framework in large-scale convex optimization, based on decoupling of the objective function into terms with...
-
Stability of Minimization Problems and the Error Bound Condition
It is well known that Error Bound conditions provide some (usually linear or sublinear) rate of convergence for gradient descent methods in...
-
Direct Minimization of the Canham–Helfrich Energy on Generalized Gauss Graphs
The existence of minimizers of the Canham–Helfrich functional in the setting of generalized Gauss graphs is proved. As a first step, the...
-
Constrained composite optimization and augmented Lagrangian methods
We investigate finite-dimensional constrained structured optimization problems, featuring composite objective functions and set-membership...
-
The exact projective penalty method for constrained optimization
A new exact projective penalty method is proposed for the equivalent reduction of constrained optimization problems to nonsmooth unconstrained ones....
-
Complexity of an inexact proximal-point penalty method for constrained smooth non-convex optimization
In this paper, an inexact proximal-point penalty method is studied for constrained optimization problems, where the objective function is non-convex,...
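A bare quadratic-penalty loop conveys the basic mechanism such methods build on (the paper's inexact proximal-point analysis and nonconvex setting are not reproduced here); the one-dimensional problem is an illustrative assumption:

```python
# Generic quadratic-penalty sketch: min f(x) = (x-3)^2  s.t.  x <= 1,
# via unconstrained subproblems
#   min f(x) + (beta/2) * max(x - 1, 0)^2   with increasing beta.
def penalized_grad(x, beta):
    return 2.0 * (x - 3.0) + beta * max(x - 1.0, 0.0)

x, beta = 0.0, 1.0
for _ in range(8):                 # outer loop: grow the penalty parameter
    for _ in range(100):           # inner loop: gradient descent on the subproblem
        x -= penalized_grad(x, beta) / (2.0 + beta)  # step ~ 1/Lipschitz
    beta *= 10.0

print(x)  # approaches the constrained minimizer x* = 1 as beta grows
```

Each subproblem is smooth and unconstrained; the constraint is enforced only in the limit, which is why penalty parameters must grow across outer iterations.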
-
A Unified Primal-Dual Algorithm Framework for Inequality Constrained Problems
In this paper, we propose a unified primal-dual algorithm framework based on the augmented Lagrangian function for composite convex problems with...
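The classical method of multipliers underlying such frameworks alternates a primal minimization of the augmented Lagrangian with a dual ascent step; the tiny equality-constrained problem below is an illustrative assumption, much simpler than the paper's composite inequality-constrained setting:

```python
import numpy as np

# Method-of-multipliers sketch: solve  min ||x||^2  s.t.  a^T x = 1
# by alternating exact primal minimization of the augmented Lagrangian
#   L_rho(x, y) = ||x||^2 + y*(a^T x - 1) + (rho/2)*(a^T x - 1)^2
# with a dual ascent step on the multiplier y.
a = np.array([1.0, 2.0, 2.0])
rho, y = 1.0, 0.0
x = np.zeros(3)

for _ in range(100):
    # primal step: grad_x L = 2x + (y + rho*(a^T x - 1)) a = 0 gives x = c*a
    c = (rho - y) / (2.0 + rho * (a @ a))
    x = c * a
    # dual step: ascend on the constraint residual
    y += rho * (a @ x - 1.0)

print(x)  # converges to a / ||a||^2, the minimum-norm feasible point
```

The quadratic augmentation term is what lets the dual update converge with a fixed `rho`, in contrast to a pure penalty method where the parameter must grow without bound.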
-
Smooth over-parameterized solvers for non-smooth structured optimization
Non-smooth optimization is a core ingredient of many imaging or machine learning pipelines. Non-smoothness encodes structural constraints on the...
-
A fast primal-dual algorithm via dynamical system with variable mass for linearly constrained convex optimization
We aim to solve the linearly constrained convex optimization problem whose objective function is the sum of a differentiable function and a...
-
Efficient Convex Optimization for Non-convex Non-smooth Image Restoration
This work focuses on recovering images from various forms of corruption, for which a challenging non-smooth, non-convex optimization model is...