TY - JOUR
T1 - Generalized symmetric ADMM for separable convex optimization
AU - Bai, Jianchao
AU - Li, Jicheng
AU - Xu, Fengmin
AU - Zhang, Hongchao
N1 - Publisher Copyright:
© 2017, Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2018/5/1
Y1 - 2018/5/1
N2 - The alternating direction method of multipliers (ADMM) has proved to be effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. GS-ADMM partitions the variables into two groups, where one group consists of p block variables and the other of q block variables, with p ≥ 1 and q ≥ 1 being integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method attractive for big data settings. By adding proper proximal terms to the subproblems, we specify a domain of the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case O(1/t) ergodic convergence rate. This convergence domain of the stepsizes is significantly larger than those established in the literature, so GS-ADMM is more flexible in choosing and using larger stepsizes for the dual variable. In addition, two special cases of GS-ADMM, which allow zero penalty terms, are also discussed and analyzed. Compared with several state-of-the-art methods, preliminary numerical experiments on a sparse matrix minimization problem from statistical learning show that the proposed method is effective and promising.
AB - The alternating direction method of multipliers (ADMM) has proved to be effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. GS-ADMM partitions the variables into two groups, where one group consists of p block variables and the other of q block variables, with p ≥ 1 and q ≥ 1 being integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method attractive for big data settings. By adding proper proximal terms to the subproblems, we specify a domain of the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case O(1/t) ergodic convergence rate. This convergence domain of the stepsizes is significantly larger than those established in the literature, so GS-ADMM is more flexible in choosing and using larger stepsizes for the dual variable. In addition, two special cases of GS-ADMM, which allow zero penalty terms, are also discussed and analyzed. Compared with several state-of-the-art methods, preliminary numerical experiments on a sparse matrix minimization problem from statistical learning show that the proposed method is effective and promising.
KW - Alternating direction method of multipliers
KW - Complexity
KW - Global convergence
KW - Multiple blocks
KW - Parameter convergence domain
KW - Separable convex programming
KW - Statistical learning
UR - https://www.scopus.com/pages/publications/85035117429
U2 - 10.1007/s10589-017-9971-0
DO - 10.1007/s10589-017-9971-0
M3 - Article
AN - SCOPUS:85035117429
SN - 0926-6003
VL - 70
SP - 129
EP - 170
JO - Computational Optimization and Applications
JF - Computational Optimization and Applications
IS - 1
ER -