
Learning General Gaussian Kernel Hyperparameters for SVR

Abstract: We propose a new method for optimizing the hyperparameters of the general Gaussian kernel in support vector regression (SVR). The hyperparameters are constrained to lie on a differentiable manifold, and the proposed optimization technique is a gradient-like descent algorithm adapted to the geometric structure of the manifold of symmetric positive-definite matrices. We compare our approach with classical support vector regression on real-world data sets; experiments show that the optimization improves prediction accuracy and reduces the number of support vectors.
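To make the setting concrete, here is a minimal, hypothetical sketch (not the authors' code) of the "general Gaussian" kernel k(x, y) = exp(-(x - y)^T M (x - y)), parameterized by a symmetric positive-definite matrix M, used as a precomputed kernel in scikit-learn's SVR. The manifold-based gradient descent that optimizes M is the paper's contribution and is not reproduced here; M is simply fixed to an arbitrary SPD matrix.

```python
import numpy as np
from sklearn.svm import SVR

def general_gaussian_kernel(X, Y, M):
    """Gram matrix of k(x, y) = exp(-(x - y)^T M (x - y)) for SPD M."""
    diff = X[:, None, :] - Y[None, :, :]                # shape (n, m, d)
    d2 = np.einsum('nmi,ij,nmj->nm', diff, M, diff)     # quadratic form per pair
    return np.exp(-d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)

# A fixed SPD hyperparameter matrix M (illustrative; the paper learns M
# by descent on the manifold of symmetric positive-definite matrices).
M = np.array([[1.5, 0.3],
              [0.3, 0.8]])

K_train = general_gaussian_kernel(X, X, M)
model = SVR(kernel='precomputed', C=10.0)
model.fit(K_train, y)
pred = model.predict(K_train)
```

When M is a multiple of the identity this reduces to the ordinary RBF kernel; a full SPD matrix additionally captures feature scaling and correlations, which is what makes its hyperparameters worth optimizing.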
Document type: Conference papers

https://hal-utt.archives-ouvertes.fr/hal-02861454
Contributor: Jean-Baptiste Vu Van
Submitted on : Tuesday, June 9, 2020 - 8:06:55 AM
Last modification on : Tuesday, June 16, 2020 - 4:04:02 PM


Citation

Fahed Abdallah, Hichem Snoussi, H. Laanaya, Régis Lengellé. Learning General Gaussian Kernel Hyperparameters for SVR. First International Geometric Science of Information Conference, GSI 2013, Aug 2013, Paris, France. pp.677-684, ⟨10.1007/978-3-642-40020-9_75⟩. ⟨hal-02861454⟩
