Journal article — IEEE Transactions on Signal Processing, 2004

An Improved Training Algorithm for Nonlinear Kernel Discriminants

Abstract

A simple method to derive nonlinear discriminants is to map the samples into a high-dimensional feature space F using a nonlinear function and then to perform a linear discriminant analysis in F. Clearly, if F is a very high- or even infinite-dimensional space, designing such a receiver may be a computationally intractable problem. However, using Mercer kernels, this problem can be solved without explicitly mapping the data to F. Recently, a powerful method for obtaining nonlinear kernel Fisher discriminants (KFDs) has been proposed, and very promising results were reported when compared with other state-of-the-art classification techniques. In this paper, we present an extension of the KFD method that is also based on Mercer kernels. Our approach, called the nonlinear kernel second-order discriminant (KSOD), consists of determining a nonlinear receiver by optimizing a general form of second-order performance measures. We also propose a complexity-control procedure to improve the performance of these classifiers when only a few training samples are available. Finally, simulations compare our approach with the KFD method.
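For readers unfamiliar with the baseline method the paper extends, the sketch below illustrates the standard kernel Fisher discriminant idea described in the abstract: the discriminant is expressed in dual form through the Gram matrix, so the feature space F is never constructed explicitly. This is a minimal illustration, not the authors' code; it assumes a Gaussian (RBF) kernel and a two-class problem, and it does not implement the KSOD second-order criterion or the complexity-control procedure proposed in the paper.

```python
# Minimal kernel Fisher discriminant sketch (dual/kernel formulation).
# Assumptions: RBF kernel, two classes labeled 0 and 1, ridge regularization.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between rows of A and rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def train_kfd(X, y, gamma=1.0, reg=1e-3):
    """Return dual coefficients alpha so that f(x) = sum_i alpha_i k(x_i, x)."""
    K = rbf_kernel(X, X, gamma)                   # n x n Gram matrix
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)                  # class means expressed via the kernel
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter, also expressed through the Gram matrix
    N = np.zeros_like(K)
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        H = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ H @ Kc.T
    N += reg * np.eye(len(y))                     # regularization for numerical stability
    alpha = np.linalg.solve(N, m1 - m0)           # Fisher direction in dual coordinates
    return alpha

def project(alpha, X_train, X_new, gamma=1.0):
    """Project new samples onto the discriminant direction."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

A threshold on the projected values then yields the classifier; the paper's contribution replaces the Fisher criterion above with a general second-order performance measure and adds complexity control for small training sets.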
No file deposited

Dates and versions

hal-02297580, version 1 (26-09-2019)

Identifiers

Cite

Fahed Abdallah, Cédric Richard, Régis Lengellé. An Improved Training Algorithm for Nonlinear Kernel Discriminants. IEEE Transactions on Signal Processing, 2004, 52 (10), pp.2798-2806. ⟨10.1109/TSP.2004.834346⟩. ⟨hal-02297580⟩