Keywords

Data Science; Support Vector Machines; Optimization; Statistics

Abstract

One-class classification has emerged as a powerful technique for data description, enabling a model to learn exclusively from data belonging to a single (target) class. By focusing solely on patterns within this class, the procedure assesses whether incoming input belongs to the target group or deviates from it. In doing so, the model develops a detailed characterization of the defining features of the target data, and any point that diverges from the learned distribution is flagged as an anomaly. One way to implement one-class classification is through Least Squares Support Vector Data Description (LS-SVDD).

In this study, we apply the LS-SVDD framework, using gradient descent methods with adaptive learning rates to solve the associated optimization problem. We investigate a range of adaptive optimization strategies to simultaneously estimate the center and radius of the hypersphere that encloses the target data. Our approach uses a closed-form solution for the radius, which is updated iteratively in tandem with the center. Performance is assessed by the final cost function value, the optimized center and radius, and execution time. The results demonstrate that adaptive learning rate methods not only yield competitive accuracy but also converge significantly faster than traditional gradient descent, highlighting their value in modern machine learning applications.
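
The thesis text is not reproduced on this page, but the optimization the abstract describes can be illustrated with a short sketch. The sketch below assumes the standard least-squares SVDD cost L(a, R^2) = R^2 + (C/2) * sum_i (||x_i - a||^2 - R^2)^2 and uses Adam as one representative adaptive-learning-rate method; the cost form, the choice of Adam, and all parameter values are illustrative assumptions rather than details taken from the thesis. Setting the derivative of this cost with respect to R^2 to zero gives the closed-form update R^2 = mean_i ||x_i - a||^2 - 1/(nC), which is refreshed after every center step.

    import numpy as np

    def ls_svdd_fit(X, C=1.0, lr=0.1, n_iter=500, beta1=0.9, beta2=0.999, eps=1e-8):
        # Fit an LS-SVDD hypersphere (center a, squared radius r2) to target data X,
        # assuming the cost L(a, R^2) = R^2 + (C/2) * sum_i (||x_i - a||^2 - R^2)^2.
        n, d = X.shape
        a = X.mean(axis=0)          # initialize the center at the sample mean
        m = np.zeros(d)             # Adam first-moment estimate
        v = np.zeros(d)             # Adam second-moment estimate
        for t in range(1, n_iter + 1):
            dist2 = ((X - a) ** 2).sum(axis=1)   # squared distances ||x_i - a||^2
            r2 = dist2.mean() - 1.0 / (n * C)    # closed-form R^2 given the center
            # Gradient of the cost with respect to the center a
            grad = -2.0 * C * ((dist2 - r2)[:, None] * (X - a)).sum(axis=0)
            # Adam update for the center (one adaptive-learning-rate method)
            m = beta1 * m + (1 - beta1) * grad
            v = beta2 * v + (1 - beta2) * grad ** 2
            a = a - lr * (m / (1 - beta1 ** t)) / (np.sqrt(v / (1 - beta2 ** t)) + eps)
        dist2 = ((X - a) ** 2).sum(axis=1)
        r2 = dist2.mean() - 1.0 / (n * C)
        cost = r2 + 0.5 * C * ((dist2 - r2) ** 2).sum()
        return a, r2, cost

Under these assumptions, a new point x would be flagged as an anomaly when ||x - a||^2 exceeds the fitted R^2.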

Completion Date

2025

Semester

Spring

Committee Chair

Maboudou, Edgard

Degree

Master of Science (M.S.)

College

College of Sciences

Department

Statistics and Data Science

Identifier

DP0029347

Document Type

Dissertation/Thesis

Campus Location

Orlando (Main) Campus
