F-SVC: A simple and fast training algorithm for soft margin Support Vector Classification

Abstract: Support vector machines (SVMs) have achieved much success in machine learning, but their training requires solving a quadratic optimization problem, so training time increases dramatically with the size of the training set. Hence, standard SVMs have difficulty handling large-scale problems. In this paper, we present a new fast training algorithm for soft margin support vector classification. The algorithm searches for successive efficient feasible directions: a heuristic selects the direction maximally correlated with the gradient, and the optimal step size along that direction is determined analytically. Furthermore, the solution, gradient, and objective function are updated recursively. To handle large-scale problems, the Gram matrix does not need to be stored. Our iterative algorithm fully exploits the properties of quadratic functions. F-SVC is simple, easy to implement, and able to handle large data sets.
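The iteration sketched in the abstract (pick a feasible direction well correlated with the gradient, take the analytically optimal step, update the gradient recursively, never store the Gram matrix) can be illustrated with a minimal coordinate-ascent sketch on the soft-margin SVM dual. This is an assumption-laden illustration, not the authors' F-SVC algorithm: it assumes a linear kernel, drops the equality constraint sum(alpha_i * y_i) = 0 (i.e., a bias-free SVM), and moves along one coordinate per step.

```python
import numpy as np

def train_svc_coordinate_ascent(X, y, C=1.0, max_iter=1000, tol=1e-6):
    """Coordinate ascent on the bias-free soft-margin SVM dual:
        max  sum(alpha) - 0.5 * alpha^T Q alpha,  Q_ij = y_i y_j K(x_i, x_j),
        s.t. 0 <= alpha_i <= C.
    Illustrative sketch only (linear kernel, no equality constraint);
    NOT the authors' exact F-SVC algorithm."""
    n = X.shape[0]
    alpha = np.zeros(n)
    grad = np.ones(n)                      # dual gradient at alpha = 0
    for _ in range(max_iter):
        # feasible directions: coordinates that can still move uphill
        movable = ((alpha < C) & (grad > 0)) | ((alpha > 0) & (grad < 0))
        if not movable.any():
            break                          # KKT conditions satisfied
        # heuristic: coordinate with the largest gradient magnitude
        i = int(np.argmax(np.where(movable, np.abs(grad), -np.inf)))
        if abs(grad[i]) < tol:
            break
        q_i = (X @ X[i]) * y * y[i]        # row i of Q, computed on the fly
        # analytic optimal step along coordinate i, clipped to the box
        step = np.clip(alpha[i] + grad[i] / q_i[i], 0.0, C) - alpha[i]
        alpha[i] += step
        grad -= step * q_i                 # recursive gradient update
    return alpha

def predict(alpha, X, y, X_new):
    """Decision rule of the bias-free linear SVM: sign(w . x)."""
    w = X.T @ (alpha * y)
    return np.sign(X_new @ w)
```

Note how the kernel row `q_i` is recomputed at each step rather than read from a precomputed Gram matrix, mirroring the abstract's point about memory: storage is O(n) in the number of training points instead of O(n^2).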
Document type :
Conference papers

https://hal-utt.archives-ouvertes.fr/hal-02316560
Contributor : Jean-Baptiste Vu Van
Submitted on : Tuesday, October 15, 2019 - 1:58:26 PM
Last modification on : Wednesday, October 16, 2019 - 1:31:20 AM

Collections

CNRS | ROSAS | UTT

Citation

Mireille Tohmé, Régis Lengellé. F-SVC: A simple and fast training algorithm for soft margin Support Vector Classification. 2008 IEEE Workshop on Machine Learning for Signal Processing (MLSP) (formerly known as NNSP), Oct 2008, Cancun, Mexico. pp.339-344, ⟨10.1109/MLSP.2008.4685503⟩. ⟨hal-02316560⟩
