Decision theory classification of high-dimensional vectors based on small samples
support vector machine; decision theory; posterior probabilities; matrix-variate normal distribution; Statistics & Probability
This paper describes an entirely new procedure for classifying high-dimensional vectors on the basis of a few training samples. The proposed method follows the Bayesian paradigm and provides posterior probabilities that a new vector belongs to each of the classes, so it adapts naturally to any number of classes. Classification is based on a small vector that can be viewed as a regression of the new observation onto the space spanned by the training samples, similar in spirit to the Support Vector Machine classification paradigm. This is achieved by employing matrix-variate distributions in classification, an idea that is new in this setting.
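The abstract's central reduction step can be illustrated with a minimal sketch. This is not the authors' implementation; it only shows, under assumed data (random `X`, `y_new`), how a high-dimensional observation collapses to a small coefficient vector when regressed onto the span of a few training samples:

```python
import numpy as np

# Hypothetical setup: d-dimensional observations, n training samples (n << d),
# stacked as the columns of X. The "small vector" mentioned in the abstract is
# the least-squares coefficient vector of a new observation regressed onto
# the span of the training samples.
rng = np.random.default_rng(0)
d, n = 500, 10
X = rng.normal(size=(d, n))                        # training samples as columns
y_new = X @ rng.normal(size=n) + 0.01 * rng.normal(size=d)  # new observation

# Regression of the new observation onto span(X): beta is the low-dimensional
# representation that a classifier (here unspecified) would then operate on.
beta, *_ = np.linalg.lstsq(X, y_new, rcond=None)
print(beta.shape)  # dimension drops from d=500 to n=10
```

The posterior-probability computation via matrix-variate distributions is specific to the paper and is not reproduced here; the sketch only demonstrates why the resulting vector is small (its length equals the number of training samples, not the ambient dimension).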
"Decision theory classification of high-dimensional vectors based on small samples" (2008). Faculty Bibliography 2000s. 146.