An inexact ADMM for separable nonconvex and nonsmooth optimization
Document Type
Article
Publication Date
3-1-2025
Abstract
An inexact alternating direction method of multipliers (I-ADMM) with an expansion linesearch step was developed for solving a family of separable minimization problems subject to linear constraints, where the objective function is the sum of a smooth but possibly nonconvex function and a possibly nonsmooth nonconvex function. Global convergence and a linear convergence rate of the I-ADMM were established under suitable conditions, while an inexact relative-error criterion was used for solving the subproblems. In addition, a unified proximal gradient (UPG) method with momentum acceleration was proposed for solving the smooth but possibly nonconvex subproblem. This UPG method guarantees global convergence and automatically reduces to an optimal accelerated gradient method when the smooth function in the objective is convex. Our numerical experiments on solving nonconvex quadratic programming problems and sparse optimization problems from statistical learning show that the proposed I-ADMM is very effective compared with other state-of-the-art algorithms in the literature.
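To fix ideas, the splitting structure the abstract describes (smooth term plus nonsmooth term coupled by a linear constraint) can be illustrated with textbook ADMM applied to the lasso, a sparse-optimization instance of the form min 0.5‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0. This is a minimal sketch of standard exact ADMM, not the paper's I-ADMM (no expansion linesearch, no inexact relative-error criterion, no UPG momentum step); the function name and parameters are illustrative assumptions.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    """Textbook ADMM sketch for min_x 0.5*||Ax-b||^2 + lam*||z||_1, s.t. x - z = 0.

    Not the paper's I-ADMM: the subproblems here are solved exactly.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable (Lagrange multiplier / rho)
    # Cache the matrix for the smooth x-subproblem (A^T A + rho I) x = A^T b + rho (z - u).
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        # Smooth subproblem: exact solve (the paper would use an inexact UPG-type step here).
        x = np.linalg.solve(AtA, Atb + rho * (z - u))
        # Nonsmooth subproblem: proximal map of the l1 norm, i.e., soft-thresholding.
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Multiplier update on the residual of the linear constraint x - z = 0.
        u = u + x - z
    return z
```

With `A` the identity, the iteration converges to the soft-thresholded data, so large entries of `b` survive shrunk by `lam` and small entries are set exactly to zero, which is the sparsity behavior the paper's statistical-learning experiments target.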
Publication Source (Journal or Book title)
Computational Optimization and Applications
First Page
445
Last Page
479
Recommended Citation
Bai, J., Zhang, M., & Zhang, H. (2025). An inexact ADMM for separable nonconvex and nonsmooth optimization. Computational Optimization and Applications, 90 (2), 445-479. https://doi.org/10.1007/s10589-024-00643-y