Title

An Efficient Active Set Method For SVM Training Without Singular Inner Problems

Abstract

Efficiently implemented active set methods have been applied successfully to Support Vector Machine (SVM) training. Compared to decomposition methods such as Sequential Minimal Optimization (SMO), these active set methods offer higher precision and incremental training at the cost of additional memory requirements. However, all existing active set methods must deal with singularities that arise in the inner problem solved at each iteration, which leads to more complex implementations and potential inefficiencies. Here, we adapt a revised simplex method, originally introduced by Rusin, to SVM training and show that it is an active set method similar to most existing ones, with the advantage of maintaining nonsingularity of the inner problem. We compare its performance to an existing active set method introduced by Scheinberg and demonstrate an improvement in training times in some cases. We show that our method admits a slightly simpler implementation and offers advantages when iterative methods are applied to alleviate memory concerns. We also compare the performance of the active set methods to state-of-the-art decomposition implementations such as SVMLight and SMO. © 2009 IEEE.
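
For context, the "inner problem" referred to above is, in most active set SVM solvers, an equality-constrained quadratic subproblem over the currently free dual variables, typically solved through its KKT system. The sketch below is a minimal illustration of that generic setup, not the authors' implementation; the names (solve_inner_problem, Q_FF, y_F, lin_F, eq_rhs) are assumptions introduced here, and a dense NumPy solve stands in for the incremental factorization updates a real solver would use.

```python
import numpy as np

def solve_inner_problem(Q_FF, y_F, lin_F, eq_rhs):
    """Illustrative solve of the equality-constrained QP that typical
    active set SVM solvers face at each iteration (not the paper's
    exact formulation):

        min_a  0.5 * a^T Q_FF a + lin_F^T a
        s.t.   y_F^T a = eq_rhs

    Q_FF is the kernel sub-matrix over the free (non-bounded) dual
    variables, y_F the corresponding labels, lin_F the linear term.
    """
    n = len(y_F)
    # KKT system: [[Q_FF, y_F], [y_F^T, 0]] @ [a; b] = [-lin_F; eq_rhs]
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = Q_FF
    K[:n, n] = y_F
    K[n, :n] = y_F
    rhs = np.concatenate([-np.asarray(lin_F, dtype=float), [eq_rhs]])
    # If Q_FF is rank deficient (duplicate points, low-rank or linear
    # kernels), this KKT matrix can be singular -- the situation the
    # paper's revised simplex formulation is designed to avoid.
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n]  # free dual variables and the bias term
```

In practice an active set solver would update a factorization of this system incrementally as variables enter and leave the free set rather than re-solving from scratch; the abstract's point is that the revised simplex formulation keeps this inner system nonsingular, avoiding the special-case handling that degenerate solves otherwise require.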

Publication Date

11-18-2009

Publication Title

Proceedings of the International Joint Conference on Neural Networks

Number of Pages

2875-2882

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1109/IJCNN.2009.5178948

Scopus ID

70449395243 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/70449395243
