Authors

F. Abramovich; V. Grinshtein; M. Pensky

Abbreviated Journal Title

Ann. Stat.

Keywords

adaptivity; complexity penalty; maximum a posteriori rule; minimax estimation; sequence estimation; sparsity; thresholding; FALSE DISCOVERY RATE; INFLATION CRITERION; VARIABLE SELECTION; REGRESSION; SHRINKAGE; MODEL; RISK; Statistics & Probability

Abstract

We consider the problem of recovering a high-dimensional vector μ observed in white noise, where the unknown vector μ is assumed to be sparse. The objective of the paper is to develop a Bayesian formalism which gives rise to a family of ℓ₀-type penalties. The penalties are associated with various choices of the prior distributions π_n(·) on the number of nonzero entries of μ and, hence, are easy to interpret. The resulting Bayesian estimators lead to a general thresholding rule which accommodates many of the known thresholding and model selection procedures as particular cases corresponding to specific choices of π_n(·). Furthermore, they achieve optimality in a rather general setting under very mild conditions on the prior. We also specify the class of priors π_n(·) for which the resulting estimator is adaptively optimal (in the minimax sense) for a wide range of sparse sequences and consider several examples of such priors.
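
To make the abstract's setting concrete, the sketch below illustrates a generic ℓ₀-type penalized thresholding rule in the Gaussian sequence model. It is a minimal, hypothetical example only: the penalty `pen(k)` (a 2σ²·k·log(n/k)-style term) and the function `l0_penalized_estimate` are illustrative assumptions, not the penalties the paper derives from the priors π_n(·).

```python
# Illustrative sketch (not the paper's exact procedure): l0-type penalized
# estimation in the Gaussian sequence model y_i = mu_i + sigma * z_i.
# The default penalty pen(k) is a placeholder for a penalty induced by a
# prior pi_n(k) on the number of nonzero entries of mu.
import numpy as np

def l0_penalized_estimate(y, sigma, pen=None):
    """Hard-thresholded estimate of mu: keep the k largest |y_i|,
    where k minimizes (discarded sum of squares) + pen(k)."""
    n = len(y)
    if pen is None:
        # Assumed placeholder penalty of 2*sigma^2 * k * log(n/k) flavor.
        pen = lambda k: 2.0 * sigma**2 * k * np.log(max(n / max(k, 1), np.e))

    order = np.argsort(-np.abs(y))            # coordinates sorted by |y_i|, descending
    sorted_sq = np.abs(y[order]) ** 2
    # Residual sum of squares if only the k largest coordinates are kept, k = 0..n.
    tail_rss = np.concatenate(([sorted_sq.sum()],
                               sorted_sq.sum() - np.cumsum(sorted_sq)))
    crit = np.array([tail_rss[k] + pen(k) for k in range(n + 1)])
    k_hat = int(np.argmin(crit))

    mu_hat = np.zeros(n)
    keep = order[:k_hat]
    mu_hat[keep] = y[keep]                    # keep-or-kill (hard thresholding) rule
    return mu_hat

# Usage: sparse signal of length 200 with 10 nonzero entries.
rng = np.random.default_rng(0)
mu = np.zeros(200); mu[:10] = 5.0
y = mu + rng.standard_normal(200)
print(np.count_nonzero(l0_penalized_estimate(y, sigma=1.0)))
```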

Journal Title

Annals of Statistics

Volume

35

Issue/Number

5

Publication Date

1-1-2007

Document Type

Article

Language

English

First Page

2261

Last Page

2286

WOS Identifier

WOS:000251096100017

ISSN

0090-5364
