
Generalized symmetric ADMM for separable convex optimization

  • Jianchao Bai
  • Jicheng Li
  • Fengmin Xu
  • Hongchao Zhang
Research output: Contribution to journal › Article › peer-review

80 Citations (Scopus)

Abstract

The alternating direction method of multipliers (ADMM) has proven effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. GS-ADMM partitions the variables into two groups, one consisting of p block variables and the other of q block variables, where p ≥ 1 and q ≥ 1 are integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method very attractive in a big-data setting. By adding proper proximal terms to the subproblems, we specify a domain of the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case O(1/t) ergodic convergence rate. It turns out that our convergence domain of the stepsizes is significantly larger than other convergence domains in the literature, so GS-ADMM is more flexible and allows choosing larger stepsizes for the dual variable. Besides, two special cases of GS-ADMM, which allow using zero penalty terms, are also discussed and analyzed. Compared with several state-of-the-art methods, preliminary numerical experiments on a sparse matrix minimization problem in statistical learning show that our proposed method is effective and promising.
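
To make the grouped update order concrete, the following is a minimal sketch of one GS-ADMM iteration for the prototypical two-group problem min ∑_{i=1}^p f_i(x_i) + ∑_{j=1}^q g_j(y_j) subject to ∑_i A_i x_i + ∑_j B_j y_j = c. The penalty parameter β, the proximal matrices P_i and Q_j, and the dual stepsizes τ and s are illustrative notation assumed here; the paper itself specifies the exact subproblems and the admissible stepsize domain.

% A sketch of one GS-ADMM iteration under the assumptions stated above;
% \mathcal{L}_\beta denotes the augmented Lagrangian with penalty \beta > 0.
\begin{align*}
  % x-group: p blocks updated in parallel (Jacobi), other blocks held at step k
  x_i^{k+1} &= \arg\min_{x_i}\; \mathcal{L}_\beta\bigl(x_1^k,\dots,x_i,\dots,x_p^k,\, y^k,\, \lambda^k\bigr)
               + \tfrac{1}{2}\bigl\|x_i - x_i^k\bigr\|_{P_i}^2, \quad i = 1,\dots,p, \\
  % first dual update with stepsize \tau
  \lambda^{k+\frac{1}{2}} &= \lambda^k - \tau\beta\Bigl(\textstyle\sum_{i=1}^p A_i x_i^{k+1}
               + \sum_{j=1}^q B_j y_j^k - c\Bigr), \\
  % y-group: q blocks updated in parallel (Jacobi), after the x-group (Gauss--Seidel)
  y_j^{k+1} &= \arg\min_{y_j}\; \mathcal{L}_\beta\bigl(x^{k+1},\, y_1^k,\dots,y_j,\dots,y_q^k,\, \lambda^{k+\frac{1}{2}}\bigr)
               + \tfrac{1}{2}\bigl\|y_j - y_j^k\bigr\|_{Q_j}^2, \quad j = 1,\dots,q, \\
  % second dual update with stepsize s
  \lambda^{k+1} &= \lambda^{k+\frac{1}{2}} - s\beta\Bigl(\textstyle\sum_{i=1}^p A_i x_i^{k+1}
               + \sum_{j=1}^q B_j y_j^{k+1} - c\Bigr).
\end{align*}

The two dual updates with separate stepsizes τ and s are the "symmetric" feature the abstract refers to, and the proximal terms with P_i and Q_j are the added terms that enable the enlarged stepsize domain.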

Original language: English
Pages (from-to): 129-170
Number of pages: 42
Journal: Computational Optimization and Applications
Volume: 70
Issue number: 1
DOI
Publication status: Published - 1 May 2018

