Initializing back propagation networks with prototypes - Université de technologie de Troyes
Journal article in Neural Networks, 1993

Initializing back propagation networks with prototypes

Abstract

This paper addresses the problem of initializing the weights in back propagation networks with one hidden layer. The proposed method relies on the use of reference patterns, or prototypes, and on a transformation which maps each vector in the original feature space onto a unit-length vector in a space with one additional dimension. This scheme applies to pattern recognition tasks, as well as to the approximation of continuous functions. Issues related to the preprocessing of input patterns and to the generation of prototypes are discussed, and an algorithm for building appropriate prototypes in the continuous case is described. Also examined is the relationship between this approach and the theory of radial basis functions. Finally, simulation results are presented, showing that initializing back propagation networks with prototypes generally results in (a) drastic reductions in training time, (b) improved robustness against local minima, and (c) better generalization.
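The abstract mentions a transformation that maps each vector in the original feature space onto a unit-length vector in a space with one additional dimension. A minimal sketch of one common way to realize such a mapping is shown below, assuming the extra component is a constant that is appended before renormalization; the function name `to_unit_sphere` and the constant `d` are illustrative, not taken from the paper.

```python
import numpy as np

def to_unit_sphere(x, d=1.0):
    """Map a feature vector x in R^n onto a unit-length vector in R^(n+1)
    by appending a constant component d and normalizing.

    Hypothetical reconstruction of the kind of transformation described
    in the abstract; the exact scheme in the paper may differ.
    """
    x = np.asarray(x, dtype=float)
    z = np.append(x, d)          # one additional dimension
    return z / np.linalg.norm(z)  # unit Euclidean length

# Example: a 2-D input becomes a unit vector in 3-D.
z = to_unit_sphere([3.0, 4.0])
```

Because every transformed pattern lies on the unit sphere, distances between patterns relate directly to inner products, which is one reason such a mapping pairs naturally with prototype-based weight initialization.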

Dates and versions

hal-02861437 , version 1 (09-06-2020)

Identifiers

Cite

Thierry Denoeux, Régis Lengellé. Initializing back propagation networks with prototypes. Neural Networks, 1993, 6 (3), pp.351-363. ⟨10.1016/0893-6080(93)90003-F⟩. ⟨hal-02861437⟩