Kernel Methods For Changes Detection In Covariance Matrices

Keywords

Matrix outlier detection; Matrix variates; Mercer kernel; One-class classification; Power-Euclidean kernel; Support matrices

Abstract

Several methods have been proposed to solve the one-class classification problem for vectors; three are in common use: density estimation, boundary methods, and reconstruction methods. The focus here is on boundary methods, which include the k-center method, the nearest-neighbor method, the one-class support vector machine (OCSVM), and the support vector data description (SVDD). In industrial applications such as statistical process control (SPC), practitioners have successfully used SVDD to detect anomalies or outliers in a process. However, a shift in a multivariate process occurs in either location or scale, and thus far most research effort, including the OCSVM and SVDD, has focused on location (vectors or mean vectors). Several methods have recently been proposed to monitor the scale, i.e., the covariance matrix. Most of these methods deal with a full-rank covariance matrix, i.e., a situation where the number of rational subgroups is larger than the number of variables. When the number of variables is nearly as large as, or larger than, the number of observations, most Shewhart-type charts fail because the estimated covariance matrix is not full rank. This work extends the one-class classification method using kernels to detect changes in the covariance matrix when the number of observations available is less than the number of variables.
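The flavor of the approach can be illustrated with a minimal sketch: a kernel distance-to-centroid boundary rule applied to singular sample covariance matrices, using a Gaussian kernel built on the power-Euclidean distance. This is an assumed construction for illustration only, not the paper's exact method; the function names and the parameter choices (`alpha`, `gamma`) are hypothetical.

```python
import numpy as np

def spd_power(S, alpha=0.5):
    """Matrix power of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues
    return (V * w**alpha) @ V.T

def pe_gauss_kernel(A, B, alpha=0.5, gamma=0.05):
    """Gaussian kernel on the power-Euclidean (Frobenius) distance."""
    d = np.linalg.norm(spd_power(A, alpha) - spd_power(B, alpha), "fro")
    return np.exp(-gamma * d**2)

def centroid_distances(train, test, **kw):
    """Squared feature-space distance of each test matrix to the centroid
    of the training matrices (a simple kernel boundary rule)."""
    K = np.array([[pe_gauss_kernel(A, B, **kw) for B in train] for A in train])
    k_bar = K.mean()
    dists = []
    for X in test:
        kx = np.array([pe_gauss_kernel(X, A, **kw) for A in train])
        dists.append(pe_gauss_kernel(X, X, **kw) - 2.0 * kx.mean() + k_bar)
    return np.array(dists)

# Rational subgroups with fewer observations (m) than variables (p),
# so every sample covariance matrix below is singular.
rng = np.random.default_rng(0)
p, m = 10, 5
cov = lambda scale: np.cov(scale * rng.standard_normal((m, p)), rowvar=False)
train = [cov(1.0) for _ in range(30)]       # in-control reference subgroups
d_ic = centroid_distances(train, [cov(1.0) for _ in range(10)])
d_oc = centroid_distances(train, [cov(2.0) for _ in range(10)])  # scale shift
```

Thresholding the distances at an empirical quantile of the in-control values would give a control limit analogous to a Shewhart-type chart; out-of-control subgroups with an inflated covariance should tend to land farther from the centroid than in-control ones.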

Publication Date

7-3-2018

Publication Title

Communications in Statistics: Simulation and Computation

Volume

47

Issue

6

Number of Pages

1704-1721

Document Type

Article

Personal Identifier

scopus

DOI Link

https://doi.org/10.1080/03610918.2017.1322701

Scopus ID

85021836808 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/85021836808
