Journal article: IEEE Transactions on Signal Processing, 2004

An Improved Training Algorithm for Nonlinear Kernel Discriminants

Fahed Abdallah, Cédric Richard, Régis Lengellé
Abstract

A simple way to derive nonlinear discriminants is to map the samples into a high-dimensional feature space F using a nonlinear function and then perform linear discriminant analysis in F. Clearly, if F is a very high or even infinite-dimensional space, designing such a receiver may be computationally intractable. Using Mercer kernels, however, this problem can be solved without explicitly mapping the data to F. Recently, a powerful method for obtaining nonlinear kernel Fisher discriminants (KFDs) was proposed, and very promising results were reported in comparison with other state-of-the-art classification techniques. In this paper, we present an extension of the KFD method that is also based on Mercer kernels. Our approach, called the nonlinear kernel second-order discriminant (KSOD), determines a nonlinear receiver by optimizing a general form of second-order measures of performance. We also propose a complexity-control procedure to improve the performance of these classifiers when few training data are available. Finally, simulations compare our approach with the KFD method.
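To make the kernel-trick idea concrete, here is a minimal sketch of the KFD baseline the abstract builds on, assuming a Gaussian (RBF) Mercer kernel and a ridge-regularized within-class scatter. The function names and parameters are illustrative, not from the paper, and this is the standard KFD, not the KSOD method the paper proposes.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (Mercer) kernel: k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_kfd(X, y, gamma=0.5, reg=1e-3):
    """Fit the dual coefficients alpha maximizing the Fisher criterion in F.

    alpha = N^{-1} (m_1 - m_0), where m_c is the kernel vector of class-c
    means and N the within-class scatter, regularized by reg*I for stability.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    m = {}
    N = reg * np.eye(n)
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                       # n x n_c kernel block
        m[c] = Kc.mean(axis=1)
        H = np.eye(len(idx)) - 1.0 / len(idx)
        N += Kc @ H @ Kc.T                   # within-class scatter term
    alpha = np.linalg.solve(N, m[1] - m[0])
    b = -0.5 * alpha @ (m[0] + m[1])         # threshold between class means
    return alpha, b

def predict_kfd(alpha, b, X_train, X_new, gamma=0.5):
    """Project new samples onto the discriminant; class 1 if score > 0."""
    scores = rbf_kernel(X_new, X_train, gamma) @ alpha + b
    return (scores > 0).astype(int)
```

All computations use only the kernel matrix, so the (possibly infinite-dimensional) feature space F is never formed explicitly; only the n x n system in `train_kfd` must be solved, which is where a complexity-control or regularization procedure becomes important when training data are scarce.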
No file available.

Dates and versions

hal-02297580 , version 1 (26-09-2019)

Cite

Fahed Abdallah, Cédric Richard, Régis Lengellé. An Improved Training Algorithm for Nonlinear Kernel Discriminants. IEEE Transactions on Signal Processing, 2004, 52 (10), pp.2798-2806. ⟨10.1109/TSP.2004.834346⟩. ⟨hal-02297580⟩