TY - JOUR
T1 - SVR+RVR
T2 - A robust sparse kernel method for regression
AU - Zhang, Gai Ying
AU - Guo, Gao
AU - Zhang, Jiang She
PY - 2010/10
Y1 - 2010/10
N2 - Support vector machine (SVM) and relevance vector machine (RVM) are two state-of-the-art kernel learning methods, but each has a drawback: although the SVM is very robust against outliers, it makes unnecessarily liberal use of basis functions, since the number of support vectors typically grows linearly with the size of the training set; the RVM, by contrast, yields an astonishingly sparse solution, but its performance deteriorates significantly when the observations are contaminated by outliers. In this paper, we present a combination of SVM and RVM for regression problems in which the two methods are concatenated: first, a support vector regression (SVR) machine is trained on the full training set; then a relevance vector regression (RVR) machine is trained only on the subset consisting of the support vectors, with their target values replaced by the SVR predictions. This combination overcomes the drawbacks of both SVR and RVR. Experiments demonstrate that SVR+RVR is both very sparse and robust.
KW - Support vector machine (SVM)
KW - outlier
KW - relevance vector machine (RVM)
KW - relevance vector regression (RVR)
KW - robustness
KW - sparseness
KW - support vector regression (SVR)
UR - https://www.scopus.com/pages/publications/77958130158
U2 - 10.1142/S0218213010000340
DO - 10.1142/S0218213010000340
M3 - Article
AN - SCOPUS:77958130158
SN - 0218-2130
VL - 19
SP - 627
EP - 645
JO - International Journal on Artificial Intelligence Tools
JF - International Journal on Artificial Intelligence Tools
IS - 5
ER -