
Sparse online regression algorithm with insensitive loss functions

Research output: Contribution to journal › Article › peer-review

Abstract

Online learning is an efficient approach in machine learning and statistics that iteratively updates a model as a sequence of training examples is observed. A representative online learning algorithm is online gradient descent, which has found wide application due to its low complexity and its scalability to large datasets. Kernel-based learning methods have proven quite successful at handling nonlinearity in the data and multivariate optimization. In this paper we present a class of kernel-based online gradient descent algorithms for regression problems, which generate sparse estimators iteratively in order to reduce the algorithmic complexity of training on streaming data and of model selection in large-scale learning scenarios. In the setting of support vector regression (SVR), we design the sparse online learning algorithm by introducing a sequence of insensitive, distance-based loss functions. We prove consistency and error bounds that quantify the generalization performance of such algorithms under mild conditions. The theoretical results demonstrate the interplay between statistical accuracy and the sparsity property during the learning process. We show that the insensitive parameter plays a crucial role in providing sparsity as well as fast convergence rates. Numerical experiments also support our theoretical results.
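The mechanism the abstract describes can be illustrated with a minimal sketch. The key point is that an ε-insensitive loss has zero gradient whenever the prediction error lies inside the ε-tube, so an online kernel gradient-descent update adds no new kernel coefficient on those rounds, and the estimator stays sparse. The class name, parameter values, and the choice of a Gaussian kernel below are illustrative assumptions, not the authors' exact construction (the paper uses a sequence of insensitive parameters and proves bounds for a general class of losses):

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel; an illustrative choice of reproducing kernel."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

class SparseOnlineSVR:
    """Sketch of kernel online gradient descent with an epsilon-insensitive loss.

    A kernel coefficient is stored only for examples whose prediction error
    exceeds epsilon; rounds inside the insensitive tube contribute nothing,
    which is the source of sparsity in the estimator.
    """

    def __init__(self, epsilon=0.1, eta=0.1, lam=0.01, sigma=1.0):
        self.epsilon = epsilon  # insensitive-tube half-width
        self.eta = eta          # step size (learning rate)
        self.lam = lam          # regularization parameter
        self.sigma = sigma
        self.centers, self.coefs = [], []

    def predict(self, x):
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(self.coefs, self.centers))

    def partial_fit(self, x, y):
        err = y - self.predict(x)
        # Regularization shrinks all existing coefficients each round.
        self.coefs = [(1 - self.eta * self.lam) * a for a in self.coefs]
        # Insensitive loss: zero gradient inside the tube, so no new center.
        if abs(err) > self.epsilon:
            self.centers.append(np.asarray(x, dtype=float))
            self.coefs.append(self.eta * np.sign(err))
```

A larger ε keeps more rounds inside the tube (fewer stored coefficients, cheaper predictions), at the cost of statistical accuracy; the paper's theory quantifies exactly this trade-off.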

Original language: English
Article number: 105316
Journal: Journal of Multivariate Analysis
Volume: 202
DOIs
State: Published - Jul 2024

Keywords

  • Insensitive loss
  • Online learning
  • Quantile regression
  • Reproducing kernel Hilbert space
  • Sparsity
