Abstract
Parameter optimization is one of the main research directions in support vector machines (SVMs). Recently, a gradient descent algorithm based on the radius-margin (RM) bound was developed that can tune multiple parameters of the SVM with a squared cost function automatically and efficiently; however, few issues related to the practical use of this type of SVM have been discussed so far. This paper studies the performance of the SVM with a squared cost function on pattern recognition tasks and compares it with the standard SVM. The results indicate that on balanced data both SVMs achieve almost the same classification accuracy, but the SVM with a squared cost function has more support vectors and smaller optimized parameters than the standard SVM. On unbalanced data, as the degree of imbalance between the two classes of training samples increases, the classification accuracy of the SVM with a squared cost function decreases rapidly. The experiments also show that the gradient descent algorithm based on the RM bound is unsuitable for some data. Some analysis of the properties of the SVM with a squared cost function is also included. Finally, a pruning algorithm based on the golden section rule is proposed and applied to increase the sparseness of the SVM with a squared cost function; with this algorithm, the number of redundant support vectors can be reduced to one or zero.
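For context, a minimal formulation is sketched below, assuming (as is standard) that "squared cost function" refers to squared slack variables, i.e. the L2-SVM, and that the RM bound is Vapnik's radius-margin bound on the leave-one-out (LOO) error. These are textbook forms consistent with the abstract, not equations reproduced from the paper itself.

```latex
% L2-SVM primal: the slack variables enter the objective squared.
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + \tfrac{C}{2}\sum_{i=1}^{\ell}\xi_i^2
\qquad \text{s.t. } y_i\bigl(w\cdot\phi(x_i)+b\bigr)\ \ge\ 1-\xi_i .

% Its dual equals a hard-margin SVM with the modified kernel
%   \tilde K(x_i,x_j) = K(x_i,x_j) + \delta_{ij}/C,
% which helps explain the reduced sparseness relative to the standard SVM.

% Radius-margin bound: with R the radius of the smallest sphere
% enclosing the mapped training points, the LOO error rate satisfies
\mathrm{err}_{\mathrm{LOO}}\ \le\ \frac{R^2\,\|w\|^2}{\ell},

% so all kernel and regularization parameters \theta can be tuned
% jointly by gradient descent on T(\theta) = R^2(\theta)\,\|w(\theta)\|^2,
% which is differentiable for the L2-SVM.
```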
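The abstract states only that the pruning algorithm is "based on the golden section rule"; how it selects and removes redundant support vectors is not described here. As background, the sketch below shows plain golden-section search for a unimodal one-dimensional function, which is the rule the name refers to; its pairing with support-vector pruning is the paper's method and is not reconstructed.

```python
import math

def golden_section_minimize(f, a, b, tol=1e-6):
    """Locate the minimizer of a unimodal function f on [a, b] by
    golden-section search; the bracket shrinks by ~0.618 per step,
    costing only one new function evaluation per iteration."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, about 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:          # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Usage example: minimize a simple quadratic on [0, 5].
if __name__ == "__main__":
    print(golden_section_minimize(lambda x: (x - 2.0) ** 2, 0.0, 5.0))
```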
| Original language | English |
|---|---|
| Pages (from-to) | 982-989 |
| Number of pages | 8 |
| Journal | Jisuanji Xuebao/Chinese Journal of Computers |
| Volume | 26 |
| Issue number | 8 |
| State | Published - Aug 2003 |
Keywords
- Cost function
- Support vector machine
- Support vector pruning