
Sparse online regression algorithm with insensitive loss functions

  • Wuhan University

Research output: Contribution to journal › Article › peer-review

Abstract

Online learning is an efficient approach in machine learning and statistics, which iteratively updates models upon the observation of a sequence of training examples. A representative online learning algorithm is online gradient descent, which has found wide applications due to its low complexity and scalability to large datasets. Kernel-based learning methods have proven quite successful in dealing with nonlinearity in the data and multivariate optimization. In this paper we present a class of kernel-based online gradient descent algorithms for regression problems, which generate sparse estimators in an iterative way to reduce the algorithmic complexity of training on streaming data and of model selection in large-scale learning scenarios. In the setting of support vector regression (SVR), we design the sparse online learning algorithm by introducing a sequence of insensitive distance-based loss functions. We prove consistency and error bounds quantifying the generalization performance of such algorithms under mild conditions. The theoretical results demonstrate the interplay between statistical accuracy and the sparsity property during the learning process. We show that the insensitive parameter plays a crucial role in providing sparsity as well as fast convergence rates. The numerical experiments also support our theoretical results.
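To illustrate the idea described in the abstract, here is a minimal sketch of a sparse online kernel regression update with an ε-insensitive loss: an example only adds a new kernel term when its residual falls outside the ε-tube, so examples fitted within the tube cost nothing, which is the source of sparsity. The Gaussian kernel, the 1/√t step-size schedule, the shrinkage factor, and all parameter names are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=0.5):
    # RBF kernel between two feature vectors (an assumed kernel choice)
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

class SparseOnlineKernelSVR:
    """Hypothetical sketch of online gradient descent with an
    eps-insensitive loss; not the paper's exact algorithm."""

    def __init__(self, eps=0.2, eta0=0.5, lam=1e-3, sigma=0.5):
        self.eps, self.eta0, self.lam, self.sigma = eps, eta0, lam, sigma
        self.centers, self.coefs = [], []  # stored support centers
        self.t = 0

    def predict(self, x):
        # empty expansion predicts 0
        return sum(c * gaussian_kernel(x, z, self.sigma)
                   for c, z in zip(self.coefs, self.centers))

    def partial_fit(self, x, y):
        self.t += 1
        eta = self.eta0 / np.sqrt(self.t)  # decaying step size (assumed schedule)
        err = self.predict(x) - y
        # shrink existing coefficients: Tikhonov regularization step
        self.coefs = [(1 - eta * self.lam) * c for c in self.coefs]
        # only residuals outside the eps-tube add a kernel term -> sparsity
        if abs(err) > self.eps:
            self.centers.append(x)
            self.coefs.append(-eta * np.sign(err))

# usage: fit a noisy sine curve from a stream of examples
rng = np.random.default_rng(0)
model = SparseOnlineKernelSVR()
n = 500
for _ in range(n):
    x = rng.uniform(-1, 1, size=1)
    y = np.sin(2 * x[0]) + rng.normal(0, 0.01)
    model.partial_fit(x, y)
print(f"{len(model.centers)} support centers kept out of {n} examples")
```

A larger ε keeps more residuals inside the tube and hence stores fewer centers, at the price of a coarser fit, which mirrors the accuracy/sparsity trade-off the abstract highlights.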

Original language: English
Article number: 105316
Journal: Journal of Multivariate Analysis
Volume: 202
DOI
Publication status: Published - Jul 2024
