Learning General Gaussian Kernel Hyperparameters for SVR
Abstract
We propose a new method for optimizing the hyperparameters of the general Gaussian kernel in support vector regression (SVR). The hyperparameters are constrained to lie on a differentiable manifold, and the proposed optimization technique is a gradient-like descent algorithm adapted to the geometric structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with classical support vector regression on real-world data sets. Experiments demonstrate that the optimization improves prediction accuracy and reduces the number of support vectors.
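As a rough illustration of the idea described above (not the authors' implementation), the sketch below tunes a general Gaussian kernel k(x, z) = exp(-(x - z)^T A (x - z)) for SVR while keeping the hyperparameter matrix A on the manifold of symmetric positive-definite (SPD) matrices, using the affine-invariant exponential map for the descent step. The validation loss, finite-difference gradient, SVR settings (e.g. C=10.0), step size, and toy data are illustrative assumptions, not choices taken from the paper.

```python
# Sketch: gradient-like descent on the SPD manifold for the general
# Gaussian kernel hyperparameter A of an SVR model (illustrative only).
import numpy as np
from scipy.linalg import expm, sqrtm, inv
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error


def general_gaussian_gram(X, Z, A):
    """Gram matrix of k(x, z) = exp(-(x - z)^T A (x - z))."""
    D = X[:, None, :] - Z[None, :, :]                     # pairwise differences
    return np.exp(-np.einsum('ijk,kl,ijl->ij', D, A, D))


def validation_loss(A, X_tr, y_tr, X_val, y_val):
    """Fit SVR with a precomputed general Gaussian kernel; return validation MSE."""
    model = SVR(kernel='precomputed', C=10.0)             # C is an illustrative choice
    model.fit(general_gaussian_gram(X_tr, X_tr, A), y_tr)
    pred = model.predict(general_gaussian_gram(X_val, X_tr, A))
    return mean_squared_error(y_val, pred)


def euclidean_grad(A, loss, eps=1e-4):
    """Symmetric finite-difference gradient of the loss with respect to A."""
    d = A.shape[0]
    G = np.zeros_like(A)
    for i in range(d):
        for j in range(i, d):
            E = np.zeros_like(A)
            E[i, j] = E[j, i] = 1.0
            g = (loss(A + eps * E) - loss(A - eps * E)) / (2 * eps)
            G[i, j] = G[j, i] = g
    return G


def spd_gradient_step(A, G_eucl, step=0.05):
    """One descent step that stays on the SPD manifold.

    Riemannian gradient under the affine-invariant metric: grad = A G A.
    Update via the exponential map:
        A <- A^{1/2} expm(-step * A^{-1/2} grad A^{-1/2}) A^{1/2}.
    """
    grad = A @ G_eucl @ A
    A_half = np.real(sqrtm(A))
    A_half_inv = inv(A_half)
    return A_half @ expm(-step * A_half_inv @ grad @ A_half_inv) @ A_half


# Toy usage: synthetic data, a few manifold descent steps starting from A = I / d.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=120)
X_tr, y_tr, X_val, y_val = X[:80], y[:80], X[80:], y[80:]

A = np.eye(3) / 3.0
loss = lambda M: validation_loss(M, X_tr, y_tr, X_val, y_val)
for it in range(10):
    A = spd_gradient_step(A, euclidean_grad(A, loss))
    print(f"iter {it}: validation MSE = {loss(A):.4f}")
```

Because every update is an exponential-map step, A remains symmetric positive-definite throughout, which is the role the manifold constraint plays in the abstract's description.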