An Improved Training Algorithm for Nonlinear Kernel Discriminants

Abstract: A simple method to derive nonlinear discriminants is to map the samples into a high-dimensional feature space F using a nonlinear function and then to perform a linear discriminant analysis in F. Clearly, if F is a very high-dimensional, or even infinite-dimensional, space, designing such a receiver may be a computationally intractable problem. However, using Mercer kernels, this problem can be solved without explicitly mapping the data to F. Recently, a powerful method for obtaining nonlinear kernel Fisher discriminants (KFDs) has been proposed, and very promising results were reported compared with other state-of-the-art classification techniques. In this paper, we present an extension of the KFD method that is also based on Mercer kernels. Our approach, called the nonlinear kernel second-order discriminant (KSOD), consists of determining a nonlinear receiver via optimization of a general form of second-order measures of performance. We also propose a complexity control procedure to improve the performance of these classifiers when few training samples are available. Finally, simulations compare our approach with the KFD method.
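As a rough illustration of the kernel trick described in the abstract, the sketch below trains a standard kernel Fisher discriminant (the KFD baseline that the paper extends, not the proposed KSOD receiver) with a Gaussian kernel in Python/NumPy. The function names, the ridge regularization parameter reg, and the toy data are assumptions made for this illustration only.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) Mercer kernel between the rows of A and the rows of B
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def train_kfd(X, y, gamma=1.0, reg=1e-3):
    # Two-class kernel Fisher discriminant: the discriminant direction in the
    # feature space F is expanded over the training samples, so only kernel
    # evaluations are needed and the data are never mapped to F explicitly.
    classes = np.unique(y)
    K = rbf_kernel(X, X, gamma)                    # full Gram matrix
    n = X.shape[0]
    means, N = [], np.zeros((n, n))
    for c in classes:
        Kc = K[:, y == c]                          # kernel columns for class c
        lc = Kc.shape[1]
        means.append(Kc.mean(axis=1))
        # within-class scatter expressed through the kernel matrix
        N += Kc @ (np.eye(lc) - np.full((lc, lc), 1.0 / lc)) @ Kc.T
    N += reg * np.eye(n)                           # ridge term for complexity control
    # expansion coefficients maximizing the Fisher ratio in F
    alpha = np.linalg.solve(N, means[0] - means[1])

    def project(Xnew):
        return rbf_kernel(Xnew, X, gamma) @ alpha
    return alpha, project

# toy usage: two Gaussian classes in the input space
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
alpha, project = train_kfd(X, y, gamma=0.5)
scores = project(X)
print("mean projected score, class 0:", scores[y == 0].mean())
print("mean projected score, class 1:", scores[y == 1].mean())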
Document type: Journal articles

https://hal-utt.archives-ouvertes.fr/hal-02297580
Contributor: Jean-Baptiste Vu Van
Submitted on: Thursday, September 26, 2019 - 11:36:56 AM
Last modification on: Friday, September 27, 2019 - 1:27:39 AM

Citation

Fahed Abdallah, Cédric Richard, Régis Lengellé. An Improved Training Algorithm for Nonlinear Kernel Discriminants. IEEE Transactions on Signal Processing, Institute of Electrical and Electronics Engineers, 2004, 52 (10), pp.2798-2806. ⟨10.1109/TSP.2004.834346⟩. ⟨hal-02297580⟩
