Nyström Regularization for Time Series Forecasting

Research output: Contribution to journal › Article › peer-review


Abstract

This paper analyzes the learning rates of Nyström regularization with sequential sub-sampling for τ-mixing time series. Using a recently developed Banach-valued Bernstein inequality for τ-mixing sequences and an integral-operator approach based on a second-order decomposition, we derive almost-optimal learning rates for Nyström regularization with sequential sub-sampling on τ-mixing time series. A series of numerical experiments verifies the theoretical results and demonstrates the excellent performance of Nyström regularization with sequential sub-sampling in learning massive time series data. These results extend the applicable range of Nyström regularization from i.i.d. samples to non-i.i.d. sequences.
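To make the setting concrete, below is a minimal sketch of Nyström-regularized kernel ridge regression with sequential sub-sampling, applied to one-step-ahead forecasting of an autoregressive (hence mixing) series. This is an illustrative assumption-laden toy, not the authors' implementation: the Gaussian kernel, the bandwidth, the regularization parameter, the lag-embedding dimension, and all function names (`gaussian_kernel`, `nystrom_krr`) are choices made here for demonstration. The one faithful ingredient is the sub-sampling rule: the m landmark points are the first m samples in temporal order, rather than a uniform random draw as in the classical i.i.d. Nyström scheme.

```python
import numpy as np


def gaussian_kernel(A, B, sigma=0.5):
    # Pairwise Gaussian (RBF) kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def nystrom_krr(X, y, m, lam=1e-3, sigma=0.5):
    """Nystrom kernel ridge regression with SEQUENTIAL sub-sampling:
    the landmarks are the first m training points in time order."""
    Z = X[:m]                                # sequential landmarks
    Knm = gaussian_kernel(X, Z, sigma)       # n x m cross-kernel
    Kmm = gaussian_kernel(Z, Z, sigma)       # m x m landmark kernel
    n = X.shape[0]
    # Nystrom normal equations: (Knm^T Knm + n*lam*Kmm) alpha = Knm^T y;
    # a small jitter keeps the solve numerically stable.
    A = Knm.T @ Knm + n * lam * Kmm + 1e-10 * np.eye(m)
    alpha = np.linalg.solve(A, Knm.T @ y)
    return lambda Xq: gaussian_kernel(Xq, Z, sigma) @ alpha


# Toy non-i.i.d. data: an AR(1) series (geometrically mixing),
# lag-embedded so that s[t] is predicted from (s[t-d], ..., s[t-1]).
rng = np.random.default_rng(0)
T, d = 500, 3
s = np.zeros(T)
for t in range(1, T):
    s[t] = 0.8 * s[t - 1] + 0.1 * rng.standard_normal()
X = np.stack([s[t - d:t] for t in range(d, T)])
y = s[d:T]

n_train = 300
model = nystrom_krr(X[:n_train], y[:n_train], m=50, lam=1e-3, sigma=0.5)
pred = model(X[n_train:])
mse = np.mean((pred - y[n_train:]) ** 2)
print(f"test MSE: {mse:.4f}")
```

Sequential sub-sampling costs O(nm^2 + m^3) instead of the O(n^3) of full kernel ridge regression, which is the point of the method for massive time series; the paper's contribution is showing that, despite the temporal dependence of the landmarks, this scheme still attains almost-optimal learning rates under τ-mixing.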

Original language: English
Article number: 312
Journal: Journal of Machine Learning Research
Volume: 23
State: Published - 1 Oct 2022

Keywords

  • Nyström regularization
  • Sub-sampling
  • Time series forecasting
  • τ-mixing process

