Abstract
Spectral algorithms have been widely used and studied in learning theory and inverse problems. This paper concerns distributed spectral algorithms, based on a divide-and-conquer approach, for handling big data. We present a learning theory for these distributed kernel-based learning algorithms in a regression framework, including error bounds and minimax-optimal learning rates, obtained by means of a novel integral-operator approach and a second-order decomposition of inverse operators. Our quantitative estimates are stated in terms of the regularity of the regression function, the effective dimension of the reproducing kernel Hilbert space, and the qualification of the filter function of the spectral algorithm. They require no eigenfunction or noise conditions and improve on existing results, even for the classical family of spectral algorithms.
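The divide-and-conquer scheme described in the abstract can be illustrated with a minimal sketch: partition the sample among local machines, run a kernel-based regularized regression on each subset, and average the local estimators. The sketch below uses kernel ridge regression (a special case of the spectral algorithms studied in the paper); all function names, the Gaussian kernel choice, and parameter values (`sigma`, `lam`, `n_machines`) are illustrative assumptions, not the paper's setup.

```python
# Sketch of divide-and-conquer kernel regression: each machine fits a
# local kernel ridge regressor, and the global estimator is their average.
# Kernel, bandwidth, and regularization values are illustrative choices.
import numpy as np

def gaussian_kernel(X, Z, sigma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def local_krr(X, y, lam=1e-3):
    """Fit kernel ridge regression on one data subset; return a predictor."""
    n = len(X)
    K = gaussian_kernel(X, X)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return lambda Xt: gaussian_kernel(Xt, X) @ alpha

def distributed_krr(X, y, n_machines=4, lam=1e-3):
    """Divide-and-conquer: fit a local predictor per subset, average them."""
    parts = np.array_split(np.arange(len(X)), n_machines)
    predictors = [local_krr(X[idx], y[idx], lam) for idx in parts]
    return lambda Xt: np.mean([f(Xt) for f in predictors], axis=0)

# Toy regression problem: noisy samples of sin(pi * x) on [-1, 1].
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(400)

f_hat = distributed_krr(X, y)
Xt = np.linspace(-1, 1, 50)[:, None]
mse = np.mean((f_hat(Xt) - np.sin(np.pi * Xt[:, 0])) ** 2)
```

Averaging the local estimators is what makes the scheme communication-efficient: each machine touches only its own subset, and a single vector of predictions is combined at the end.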
| Original language | English |
|---|---|
| Article number | 074009 |
| Journal | Inverse Problems |
| Volume | 33 |
| Issue number | 7 |
| DOIs | |
| State | Published - 21 Jun 2017 |
| Externally published | Yes |
Keywords
- distributed learning
- integral operator
- learning rate
- spectral algorithm