Title
On Extending the SMO Algorithm Sub-Problem
Abstract
The Support Vector Machine is a widely employed machine learning model due to its repeatedly demonstrated superior generalization performance. The Sequential Minimal Optimization (SMO) algorithm is one of the most popular SVM training approaches. SMO is fast and easy to implement; however, its working set size is limited to 2 points. Faster training times can result if the working set size can be increased without significantly increasing the computational complexity. In this paper, we extend the 2-point SMO formulation to a 4-point formulation and address the theoretical issues associated with such an extension. We show that modifying the SMO algorithm to increase the working set size reduces the number of iterations required for convergence, and shows promise for reducing the overall training time. ©2007 IEEE.
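For context, the 2-point sub-problem that the paper extends is the classical SMO update (Platt's formulation): with all multipliers but two frozen, the equality constraint pins the pair to a line, so the optimum along that line has a closed form followed by clipping to the box. The sketch below is illustrative only; the function name, toy kernel, and variable choices are assumptions, not the authors' 4-point formulation.

```python
import numpy as np

def smo_pair_update(alpha, y, K, E, i, j, C):
    """Analytically solve the classical 2-point SMO sub-problem for the pair (i, j).

    alpha : current Lagrange multipliers
    y     : labels in {-1, +1}
    K     : kernel (Gram) matrix
    E     : prediction errors E_k = f(x_k) - y_k
    C     : box-constraint upper bound
    Returns updated values (alpha_i, alpha_j).
    """
    # Feasible segment [L, H]: intersection of the box [0, C]^2 with the
    # constraint line y_i*alpha_i + y_j*alpha_j = const.
    if y[i] != y[j]:
        L = max(0.0, alpha[j] - alpha[i])
        H = min(C, C + alpha[j] - alpha[i])
    else:
        L = max(0.0, alpha[i] + alpha[j] - C)
        H = min(C, alpha[i] + alpha[j])

    # Second derivative of the objective along the constraint line.
    eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
    if eta <= 0.0:
        return alpha[i], alpha[j]  # degenerate direction: skip this pair

    # Unconstrained optimum along the line, then clip into [L, H].
    aj = alpha[j] + y[j] * (E[i] - E[j]) / eta
    aj = float(np.clip(aj, L, H))

    # Recover alpha_i from the equality constraint.
    ai = alpha[i] + y[i] * y[j] * (alpha[j] - aj)
    return ai, aj
```

A 4-point working set no longer admits this simple one-dimensional clip, which is the core difficulty the paper addresses: the sub-problem becomes a small constrained QP rather than a closed-form line search.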
Publication Date
12-1-2007
Publication Title
IEEE International Conference on Neural Networks - Conference Proceedings
Pages
886-891
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1109/IJCNN.2007.4371075
Copyright Status
Unknown
Scopus ID
51749089036 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/51749089036
STARS Citation
Sentelle, Christopher; Georgiopoulos, Michael; Anagnostopoulos, Georgios C.; and Young, Cynthia, "On Extending the SMO Algorithm Sub-Problem" (2007). Scopus Export 2000s. 6062.
https://stars.library.ucf.edu/scopus2000/6062