
PROXSGD: TRAINING STRUCTURED NEURAL NETWORKS UNDER REGULARIZATION AND CONSTRAINTS

  • Yang Yang
  • Yaxiong Yuan
  • Avraam Chatzimichailidis
  • Ruud J.G. van Sloun
  • Lei Lei
  • Symeon Chatzinotas

Research output: Conference contribution › Paper › peer-review

10 citations (Scopus)

Abstract

In this paper, we consider the problem of training structured neural networks (NN) with nonsmooth regularization (e.g., ℓ1-norm) and constraints (e.g., interval constraints). We formulate training as a constrained nonsmooth nonconvex optimization problem, and propose a convergent proximal-type stochastic gradient descent (ProxSGD) algorithm. We show that, under properly selected learning rates, with probability 1 every limit point of the sequence generated by the proposed ProxSGD algorithm is a stationary point. Finally, to support the theoretical analysis and demonstrate the flexibility of ProxSGD, we show through extensive numerical tests how ProxSGD can be used to train either sparse or binary neural networks via an adequate choice of the regularization function and constraint set.
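To make the abstract concrete, the sketch below shows one simplified proximal-SGD-style update for ℓ1 regularization with an interval constraint: a stochastic gradient step, then the ℓ1 proximal operator (soft-thresholding), then projection onto the interval. This is a minimal illustration of the general technique, not the paper's exact algorithm (which includes additional machinery such as momentum and convergence-ensuring step-size rules); all function names here are illustrative.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_sgd_step(w, grad, lr, l1_weight, lo=-1.0, hi=1.0):
    """One simplified proximal-SGD-style update:
    gradient step, then l1 prox, then projection onto [lo, hi]."""
    w = w - lr * grad                       # stochastic gradient step
    w = soft_threshold(w, lr * l1_weight)   # prox of the scaled l1 regularizer
    return np.clip(w, lo, hi)               # project onto the interval constraint

# Tiny usage example with hypothetical values:
w = np.array([0.5, -0.2, 0.05])
g = np.array([0.1, -0.1, 0.0])
w_new = prox_sgd_step(w, g, lr=0.1, l1_weight=0.2)
# Small weights are pulled toward zero, promoting sparsity.
```

The soft-thresholding step is what drives weights exactly to zero (sparse networks); swapping the regularizer and constraint set, as the abstract notes, adapts the same scheme to other structures such as binary networks.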

Original language: English
Publication status: Published - 2020
Published externally
Event: 8th International Conference on Learning Representations, ICLR 2020 - Addis Ababa, Ethiopia
Duration: 30 Apr 2020 → …

Conference

Conference: 8th International Conference on Learning Representations, ICLR 2020
Country/Territory: Ethiopia
City: Addis Ababa
Period: 30/04/20 → …

