Kernel second-order discriminants versus support vector machines - Archive ouverte HAL

Conference Papers, 2003

Kernel second-order discriminants versus support vector machines


Abstract

Support vector machines (SVMs) are the best-known nonlinear classifiers based on the Mercer kernel trick. They generally lead to very sparse solutions that ensure good generalization performance. Recently, S. Mika et al. (see Advances in Neural Networks for Signal Processing, p. 41-8, 1999) proposed a new nonlinear technique based on the kernel trick and the Fisher criterion: the nonlinear kernel Fisher discriminant (KFD). Experiments show that KFD is competitive with SVM classifiers. Nevertheless, it can be shown that there exist distributions for which, even though the two classes are linearly separable, the Fisher linear discriminant has an error probability close to 1. We propose an alternative strategy based on Mercer kernels that consists of selecting the optimal nonlinear receiver in the sense of the best second-order criterion. We also present a strategy for controlling the complexity of the resulting classifier. Finally, we compare this new method with SVM and KFD.
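As a point of reference for the KFD baseline discussed above, here is a minimal sketch of a kernel Fisher discriminant in the spirit of Mika et al. (1999). It is an illustrative implementation, not the paper's method: the RBF kernel, the ridge regularization constant `mu`, the kernel width `gamma`, and the toy data are all assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) Mercer kernel k(a, b) = exp(-gamma * ||a - b||^2)
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def kfd_fit(X, y, gamma=1.0, mu=1e-3):
    """Kernel Fisher discriminant for binary labels in {0, 1}.

    Returns the dual expansion coefficients alpha and a decision threshold b.
    mu is an assumed ridge term regularizing the within-class scatter.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    M = []                                   # kernel class means
    N = np.zeros((n, n))                     # within-class scatter in feature space
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                       # kernel columns of class c
        M.append(Kc.mean(axis=1))
        lc = len(idx)
        N += Kc @ (np.eye(lc) - np.full((lc, lc), 1.0 / lc)) @ Kc.T
    N += mu * np.eye(n)                      # regularization keeps N invertible
    # the between-class scatter is rank one, so the leading direction is N^{-1}(M1 - M0)
    alpha = np.linalg.solve(N, M[1] - M[0])
    # threshold halfway between the projected class means
    b = -0.5 * (alpha @ M[0] + alpha @ M[1])
    return alpha, b

def kfd_predict(X_train, alpha, b, X_new, gamma=1.0):
    # project new points onto the discriminant and threshold
    return (rbf_kernel(X_new, X_train, gamma) @ alpha + b > 0).astype(int)

# toy check on two well-separated Gaussian blobs (assumed data)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha, b = kfd_fit(X, y, gamma=0.5)
acc = (kfd_predict(X, alpha, b, X, gamma=0.5) == y).mean()
```

The abstract's point is that this Fisher-type criterion, although kernelized, can still fail on some linearly separable distributions, which motivates the authors' alternative second-order criterion.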

Dates and versions

hal-02861479 , version 1 (09-06-2020)


Cite

Fahed Abdallah, Cédric Richard, Régis Lengellé. Kernel second-order discriminants versus support vector machines. International Conference on Acoustics, Speech and Signal Processing (ICASSP'03), Apr 2003, Hong Kong, China. pp.VI-149-52, ⟨10.1109/ICASSP.2003.1201640⟩. ⟨hal-02861479⟩