Abstract
In this paper, we generalize Canonical Correlation Analysis (CCA) for discrimination to yield a new nonlinear learning machine based on kernel methods, named Kernel Canonical Correlation Discriminant Analysis (KCCDA). KCCDA is a powerful technique for extracting nonlinear features from high-dimensional data sets. To reduce the computational complexity, an adaptive learning algorithm for KCCDA based on online sparsification is proposed. Extensive experiments on artificial and real-world data sets demonstrate the competitiveness of KCCDA and the adaptive learning algorithm. Finally, we prove theoretically that KCCDA is identical to Kernel Fisher Discriminant analysis (KFD) up to an unimportant scale factor.
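The abstract states that KCCDA coincides with Kernel Fisher Discriminant analysis (KFD) up to a scale factor. As a rough illustration of the KFD side of that equivalence only, the following is a minimal sketch of a two-class kernel Fisher discriminant; the RBF kernel, the ridge regularization term, and the toy data are assumptions for illustration, not the authors' algorithm or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear problem (assumed for illustration): a blob inside a ring.
n = 60
t = rng.uniform(0, 2 * np.pi, n)
X1 = rng.normal(scale=0.3, size=(n, 2))                                 # class 1: blob
X2 = np.c_[2 * np.cos(t), 2 * np.sin(t)] + rng.normal(scale=0.2, size=(n, 2))  # class 2: ring
X = np.vstack([X1, X2])

def rbf(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B.
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

K = rbf(X, X)                 # full kernel matrix, shape (2n, 2n)
K1, K2 = K[:, :n], K[:, n:]   # columns belonging to each class

# KFD in the kernel-expansion coefficients alpha:
# maximize (alpha^T M alpha) / (alpha^T N alpha).
m1, m2 = K1.mean(axis=1), K2.mean(axis=1)
M = np.outer(m1 - m2, m1 - m2)            # between-class matrix
N = np.zeros_like(K)                      # within-class matrix
for Kj, lj in ((K1, n), (K2, n)):
    C = np.eye(lj) - np.full((lj, lj), 1.0 / lj)   # centering matrix
    N += Kj @ C @ Kj.T
N += 1e-3 * np.eye(2 * n)     # ridge regularization (assumed) for stability

# Leading eigenvector of N^{-1} M gives the discriminant coefficients.
w, V = np.linalg.eig(np.linalg.solve(N, M))
alpha = np.real(V[:, np.argmax(np.real(w))])

proj = K @ alpha              # 1-D nonlinear feature for every training point
```

Thresholding `proj` halfway between the two class means separates the blob from the ring, which a linear discriminant on the raw 2-D data cannot do; the kernel trick is what supplies the nonlinearity, exactly as in KCCDA.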
| Original language | English |
|---|---|
| Pages (from-to) | 789-795 |
| Number of pages | 7 |
| Journal | Jisuanji Xuebao/Chinese Journal of Computers |
| Volume | 27 |
| Issue number | 6 |
| State | Published - Jun 2004 |
Keywords
- Adaptive learning algorithm
- Canonical correlation analysis
- Discrimination
- Kernel Fisher Discriminant analysis (KFD)
- Kernel methods