  1. No Access

    Article

    Convergence analysis of block majorize-minimize subspace approach

    We consider the minimization of a differentiable, but not necessarily convex, function F with Lipschitz-continuous gradient, defined on \({\mathbb {R}}^N\) ...

    Emilie Chouzenoux, Jean-Baptiste Fest in Optimization Letters (2024)

  2. No Access

    Article

    SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm

    A wide class of problems involves the minimization of a coercive and differentiable function F on \({\mathbb {R}}^N\) ...

    Emilie Chouzenoux, Jean-Baptiste Fest in Journal of Optimization Theory and Applications (2022)

  3. No Access

    Article

    A random block-coordinate Douglas–Rachford splitting method with low computational complexity for binary logistic regression

    In this paper, we propose a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas–Rachford splitting method. Our algorithm performs both function and variable s...

    Luis M. Briceño-Arias, Giovanni Chierchia in Computational Optimization and Applications (2019)

  4. No Access

    Article

    A block coordinate variable metric forward–backward algorithm

    A number of recent works have emphasized the prominent role played by the Kurdyka-Łojasiewicz inequality for proving the convergence of iterative algorithms solving possibly nonsmooth/nonconvex optimization pr...

    Emilie Chouzenoux, Jean-Christophe Pesquet in Journal of Global Optimization (2016)

  5. No Access

    Article

    Variable Metric Forward–Backward Algorithm for Minimizing the Sum of a Differentiable Function and a Convex Function

    We consider the minimization of a function G defined on \({\mathbb {R}}^N\) , which is the sum of a (not necessarily...

    Emilie Chouzenoux, Jean-Christophe Pesquet in Journal of Optimization Theory and Applications (2014)