Experimental study on the performance of support vector machine with squared cost function

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Parameter optimization is one of the main research directions for the support vector machine (SVM). Recently, a gradient descent algorithm based on the radius-margin (RM) bound was developed, which can tune multiple parameters of the SVM with squared cost function automatically and efficiently. Until now, however, few issues related to the practical use of this type of SVM have been discussed. The performance of the SVM with squared cost function on pattern recognition is studied and compared with the standard SVM. The results indicate that for balanced data, both SVMs have almost the same classification accuracy, but the SVM with squared cost function possesses more support vectors and smaller optimized parameters than the standard SVM. For unbalanced data, as the degree of imbalance between the two classes of training samples increases, the classification accuracy of the SVM with squared cost function decreases rapidly. The experiments also show that the gradient descent algorithm based on the RM bound is not suitable for some data. Some analysis of the properties of the SVM with squared cost function is also included. Finally, a pruning algorithm based on the golden section rule is proposed and applied to increase the sparsity of the SVM with squared cost function. Using this algorithm, the number of redundant support vectors can be reduced to one or zero.
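The balanced-data comparison described above can be illustrated with a minimal sketch. This uses scikit-learn's `LinearSVC`, whose `loss` parameter switches between the standard hinge loss and the squared-hinge (squared cost) loss; the synthetic dataset and linear kernel are assumptions for illustration only, since the paper's own experiments use kernel SVMs tuned via the RM bound.

```python
# Hedged sketch: compare a standard (hinge-loss) SVM with a squared-hinge
# SVM on a balanced synthetic problem, mirroring the paper's observation
# that both achieve similar accuracy on balanced data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Balanced two-class toy data (an assumption, not the paper's datasets).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

acc = {}
for loss in ("hinge", "squared_hinge"):
    clf = LinearSVC(loss=loss, dual=True, max_iter=10000, random_state=0)
    clf.fit(X_tr, y_tr)
    acc[loss] = clf.score(X_te, y_te)

print(acc)
```

On balanced data such as this, the two losses typically yield very close test accuracies, consistent with the paper's finding; the differences the paper reports (support-vector counts, behavior under class imbalance) require a kernel SVM and controlled imbalance to reproduce.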

Original language: English
Pages (from-to): 982-989
Number of pages: 8
Journal: Jisuanji Xuebao/Chinese Journal of Computers
Volume: 26
Issue number: 8
State: Published - Aug 2003

Keywords

  • Cost function
  • Support vector machine
  • Support vectors pruning
