SVR+RVR: A robust sparse kernel method for regression

Research output: Contribution to journal › Article › peer-review


Abstract

Support vector machine (SVM) and relevance vector machine (RVM) are two state-of-the-art kernel learning methods, but each has a drawback: although SVM is very robust against outliers, it makes unnecessarily liberal use of basis functions, since the number of support vectors typically grows linearly with the size of the training set; the solution of RVM, on the other hand, is remarkably sparse, but its performance deteriorates significantly when the observations are contaminated by outliers. In this paper, we present a combination of SVM and RVM for regression problems in which the two methods are concatenated: first, we train a support vector regression (SVR) machine on the full training set; then a relevance vector regression (RVR) machine is trained only on the subset consisting of the support vectors, whose target values are replaced by the predictions of SVR. This combination overcomes the drawbacks of both SVR and RVR. Experiments demonstrate that SVR+RVR is both very sparse and robust.
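The two-stage pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: scikit-learn provides SVR but no RVM, so `ARDRegression` fitted on an RBF kernel design matrix stands in for relevance vector regression (the RVM is sparse Bayesian learning with ARD priors over kernel basis functions); the kernel parameters, `C`, and `epsilon` values are illustrative choices, not the authors' settings.

```python
# Hedged sketch of the SVR+RVR cascade: robust SVR first, then a sparse
# Bayesian regressor trained only on the support vectors, with targets
# replaced by SVR's (outlier-cleaned) predictions.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(120, 1)), axis=0)
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(120)
y[::15] += 2.0  # inject outliers; SVR's epsilon-insensitive loss resists them

# Stage 1: train SVR on the full (contaminated) training set.
svr = SVR(kernel="rbf", gamma=1.0, C=1.0, epsilon=0.05).fit(X, y)
sv = svr.support_            # indices of the support vectors
X_sv = X[sv]
t_sv = svr.predict(X_sv)     # targets replaced by SVR predictions

# Stage 2: sparse Bayesian regression (RVR stand-in) on the support-vector
# subset only, using an RBF kernel design matrix as the basis.
Phi = rbf_kernel(X_sv, X_sv, gamma=1.0)
rvr = ARDRegression().fit(Phi, t_sv)

def predict(X_new):
    """Predict via kernels against the support-vector subset."""
    return rvr.predict(rbf_kernel(X_new, X_sv, gamma=1.0))

y_hat = predict(X)
```

Stage 1 filters outliers (their influence is bounded by the epsilon-insensitive loss) and shrinks the candidate set; stage 2 then prunes further via ARD, so the final model keeps only the basis functions with non-negligible weights.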

Original language: English
Pages (from-to): 627-645
Number of pages: 19
Journal: International Journal on Artificial Intelligence Tools
Volume: 19
Issue number: 5
State: Published - Oct 2010

Keywords

  • Support vector machine (SVM)
  • outlier
  • relevance vector machine (RVM)
  • relevance vector regression (RVR)
  • robustness
  • sparseness
  • support vector regression (SVR)
