  1. No Access

    Article

    Convergence rates of training deep neural networks via alternating minimization methods

    Training deep neural networks (DNNs) is an important and challenging optimization problem in machine learning due to its non-convexity and non-separable structure. The alternating minimization (AM) approaches ...

    Jintao Xu, Chenglong Bao, Wenxun Xing in Optimization Letters (2024)

  2. Open Access

    Article

    A minority of final stacks yields superior amplitude in single-particle cryo-EM

    Cryogenic electron microscopy (cryo-EM) is widely used to determine near-atomic resolution structures of biological macromolecules. Due to the low signal-to-noise ratio, cryo-EM relies on averaging many images...

    Jianying Zhu, Qi Zhang, Hui Zhang, Zuoqiang Shi, Mingxu Hu in Nature Communications (2023)

  3. No Access

    Article

    A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds

    This paper is devoted to studying an augmented Lagrangian method for solving a class of manifold optimization problems, which have nonsmooth objective functions and nonlinear constraints. Under the constant po...

    Yuhao Zhou, Chenglong Bao, Chao Ding, Jun Zhu in Mathematical Programming (2023)

  4. Chapter and Conference Paper

    A Convergent Incoherent Dictionary Learning Algorithm for Sparse Coding

    Recently, sparse coding has been widely used in many applications ranging from image recovery to pattern recognition. The low mutual coherence of a dictionary is an important property that ensures the optimali...

    Chenglong Bao, Yuhui Quan, Hui Ji in Computer Vision – ECCV 2014 (2014)