
SVR+RVR: A robust sparse kernel method for regression

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Support vector machine (SVM) and relevance vector machine (RVM) are two state-of-the-art kernel learning methods, but each has a drawback: SVM is very robust against outliers, yet it makes unnecessarily liberal use of basis functions, since the number of support vectors typically grows linearly with the size of the training set; the RVM solution, by contrast, is remarkably sparse, but its performance deteriorates significantly when the observations are contaminated by outliers. In this paper, we present a combination of SVM and RVM for regression problems in which the two methods are concatenated: first, we train a support vector regression (SVR) machine on the full training set; then a relevance vector regression (RVR) machine is trained only on the subset consisting of the support vectors, whose target values are replaced by the SVR predictions. This combination overcomes the drawbacks of both SVR and RVR. Experiments demonstrate that SVR+RVR is both very sparse and robust.
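The two-stage pipeline described above can be sketched in Python. This is a minimal illustration, not the authors' implementation: scikit-learn has no built-in RVR, so the second stage is emulated with `ARDRegression` over an RBF kernel basis centered at the support vectors (ARD priors are the mechanism underlying the RVM), and all hyperparameter values (`C`, `epsilon`, `gamma`) are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel

# Toy 1-D regression data with a few injected outliers
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
y[::25] += 3.0  # contaminate every 25th observation

# Stage 1: robust fit -- SVR on the full training set
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=0.5)
svr.fit(X, y)

# Stage 2 input: keep only the support vectors, replacing their
# noisy targets with the SVR predictions (de-noised pseudo-targets)
X_sv = X[svr.support_]
y_sv = svr.predict(X_sv)

# Stage 2: sparse fit -- RVR stand-in via ARD regression on the
# kernel design matrix built from the support vectors
Phi = rbf_kernel(X_sv, X_sv, gamma=0.5)
rvr = ARDRegression()
rvr.fit(Phi, y_sv)

# Predict on new points through the same kernel basis
Phi_test = rbf_kernel(X, X_sv, gamma=0.5)
y_pred = rvr.predict(Phi_test)

# ARD prunes most basis weights, so the final model uses far
# fewer centers ("relevance vectors") than the SVR stage did
n_relevance = int(np.sum(rvr.coef_ != 0))
print(n_relevance, len(svr.support_))
```

The key design point of the paper is visible here: the robust first stage absorbs the outliers, so the second stage sees clean pseudo-targets and can prune aggressively without being misled by contaminated observations.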

Original language: English
Pages (from-to): 627-645
Number of pages: 19
Journal: International Journal on Artificial Intelligence Tools
Volume: 19
Issue number: 5
DOI
Publication status: Published - Oct 2010
