Title

Kernel Principal Subspace Mahalanobis Distances For Outlier Detection

Abstract

Over the last few years, Kernel Principal Component Analysis (KPCA) has found several applications in outlier detection. A relatively recent method uses KPCA to compute the reconstruction error (RE) of previously unseen samples and, via thresholding, to identify atypical samples. In this paper we propose an alternative method that performs the same task but instead considers Mahalanobis distances in the orthogonal complement of the subspace used to compute the reconstruction error. To illustrate its merits, we provide qualitative and quantitative results on both artificial and real datasets and show that it is competitive with, if not superior to, the original RE-based variant and the One-Class SVM detection approach on several outlier detection tasks. © 2011 IEEE.
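For concreteness, the RE-based baseline that the abstract refers to can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it fits KPCA with an RBF kernel on (assumed inlier) training data and scores new samples by their feature-space reconstruction error, i.e., the squared distance of the mapped sample to the leading kernel principal subspace. The function names, the choice of RBF kernel, and the parameters `gamma` and `q` are illustrative assumptions, and the paper's proposed Mahalanobis-distance variant in the orthogonal complement is not reproduced here.

```python
# Minimal sketch (assumed, not the paper's code): KPCA reconstruction-error
# scores for outlier detection with an RBF kernel. The paper's proposed
# Mahalanobis variant in the orthogonal complement is NOT implemented here.
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def kpca_fit(X_train, gamma=0.5, q=5):
    """Eigendecompose the centered Gram matrix and keep the top-q components."""
    n = X_train.shape[0]
    K = rbf_kernel(X_train, X_train, gamma)
    J = np.full((n, n), 1.0 / n)
    Kc = K - J @ K - K @ J + J @ K @ J            # double-centered Gram matrix
    lam, A = np.linalg.eigh(Kc)                   # eigenvalues in ascending order
    lam, A = lam[::-1][:q], A[:, ::-1][:, :q]     # top-q eigenpairs
    return {"X": X_train, "gamma": gamma, "K": K, "A": A, "lam": lam}

def kpca_reconstruction_error(model, X_new):
    """Squared feature-space distance of phi(x) to the kernel principal subspace."""
    X, gamma, K, A, lam = (model[k] for k in ("X", "gamma", "K", "A", "lam"))
    k_x = rbf_kernel(X_new, X, gamma)             # kernels with the training set
    # Center the test kernels consistently with the training-set centering.
    k_xc = k_x - k_x.mean(1, keepdims=True) - K.mean(0)[None, :] + K.mean()
    # Centered self-similarity k_c(x, x); for an RBF kernel, k(x, x) = 1.
    kxx_c = 1.0 - 2.0 * k_x.mean(1) + K.mean()
    # Projections onto the unit-norm kernel principal axes.
    proj = k_xc @ (A / np.sqrt(lam))
    return kxx_c - (proj ** 2).sum(1)             # larger value = more atypical

# Usage: fit on inliers, then threshold the scores of new samples.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))
X_test = np.vstack([rng.normal(size=(5, 2)),         # in-distribution points
                    rng.uniform(4.0, 6.0, (5, 2))])  # far-away outliers
model = kpca_fit(X_train, gamma=0.5, q=5)
print(np.round(kpca_reconstruction_error(model, X_test), 4))
```

Thresholding these scores, for example at a high quantile of the training-set scores, then flags atypical samples in the manner the abstract describes.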

Publication Date

10-24-2011

Publication Title

Proceedings of the International Joint Conference on Neural Networks

Number of Pages

2528-2535

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1109/IJCNN.2011.6033548

Scopus ID

80054719880 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/80054719880
