F-SVC: A simple and fast training algorithm for soft margin Support Vector Classification
Abstract
Support vector machines have achieved great success in machine learning, but their training requires solving a quadratic optimization problem, so training time increases dramatically with the size of the training set. Hence, standard SVMs have difficulty handling large-scale problems. In this paper, we present a new fast training algorithm for soft margin support vector classification. The algorithm searches along successive efficient feasible directions: a heuristic selects the direction maximally correlated with the gradient, and the optimal step size is determined analytically. Furthermore, the solution, gradient, and objective function are updated recursively. To handle large-scale problems, the Gram matrix does not need to be stored. Our iterative algorithm fully exploits the properties of quadratic functions. F-SVC is simple, easy to implement, and able to perform well on large data sets.
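To make the ingredients of such a feasible-direction scheme concrete, here is a minimal sketch (not the authors' F-SVC itself, whose exact rules the abstract does not specify) of coordinate ascent on the no-bias soft-margin SVM dual: at each iteration it selects, among the feasible coordinates, the one most correlated with the gradient, takes the analytically optimal step clipped to the box constraint, and updates the gradient recursively. Kernel rows are computed on demand, so the Gram matrix is never stored. The function names, the RBF kernel, and the toy data are all illustrative assumptions.

```python
import numpy as np

def rbf(X, x, gamma=0.5):
    # One kernel row, computed on the fly -- the Gram matrix is never stored.
    return np.exp(-gamma * np.sum((X - x) ** 2, axis=1))

def train_svc(X, y, C=1.0, gamma=0.5, n_iters=200, tol=1e-6):
    """Coordinate ascent on the no-bias soft-margin SVM dual (illustrative)."""
    n = len(y)
    alpha = np.zeros(n)
    g = np.ones(n)  # gradient of the dual objective at alpha = 0
    for _ in range(n_iters):
        # A coordinate direction is feasible if we can increase alpha_i
        # (g_i > 0, alpha_i < C) or decrease it (g_i < 0, alpha_i > 0).
        feasible = ((g > 0) & (alpha < C)) | ((g < 0) & (alpha > 0))
        if not feasible.any():
            break
        # Heuristic: pick the feasible coordinate most correlated with the gradient.
        i = int(np.argmax(np.where(feasible, np.abs(g), -np.inf)))
        if abs(g[i]) < tol:
            break
        Ki = rbf(X, X[i], gamma)
        # Analytically optimal step along coordinate i (Q_ii = K_ii since
        # y_i^2 = 1), clipped to the box constraint 0 <= alpha_i <= C.
        delta = np.clip(alpha[i] + g[i] / Ki[i], 0.0, C) - alpha[i]
        alpha[i] += delta
        g -= delta * y[i] * y * Ki  # recursive gradient update
    return alpha

def predict(X, y, alpha, Xtest, gamma=0.5):
    return np.sign([(alpha * y) @ rbf(X, xt, gamma) for xt in Xtest])

# Toy demo on two well-separated clusters (hypothetical data).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.],
              [3., 3.], [3., 4.], [4., 3.], [4., 4.]])
y = np.array([-1., -1., -1., -1., 1., 1., 1., 1.])
alpha = train_svc(X, y)
acc = (predict(X, y, alpha, X) == y).mean()
```

Because each iteration touches only one kernel row and updates the solution, gradient, and objective incrementally, memory usage stays linear in the number of training points.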