Sufficient dimension reduction via inverse regression: A minimum discrepancy approach
Abbreviated Journal Title: J. Am. Stat. Assoc.
Keywords: inverse regression estimator; sliced average variance estimation; sliced inverse regression; sufficient dimension reduction; principal Hessian directions; least squares; categorical predictors; structural dimension; asymptotic theory; visualization; algorithms; models; Statistics & Probability
A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression estimator (IRE), is proposed, along with inference methods and a computational algorithm. The IRE has at least three desirable properties: (1) Its estimated basis of the central dimension reduction subspace is asymptotically efficient, (2) its test statistic for dimension has an asymptotic chi-squared distribution, and (3) it provides a chi-squared test of the conditional independence hypothesis that the response is independent of a selected subset of predictors given the remaining predictors. Current methods like sliced inverse regression belong to a suboptimal class of the IR family. Comparisons of these methods are reported through simulation studies. The approach developed here also allows a relatively straightforward derivation of the asymptotic null distribution of the test statistic for dimension used in sliced average variance estimation.
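The abstract refers to sliced inverse regression (SIR) as a member of the IR family. As a point of reference for readers unfamiliar with inverse-regression dimension reduction, the following is a minimal sketch of the basic SIR estimator (not the optimal IRE proposed in the paper): standardize the predictors, average them within slices of the response, and take leading eigenvectors of the between-slice covariance. Function and parameter names here are illustrative, not from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_directions=1):
    """Basic sliced inverse regression: estimate directions spanning the
    central dimension-reduction subspace from the inverse mean E[X | y].
    This is a pedagogical sketch, not the minimum-discrepancy IRE."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Partition observations into slices by sorted response values
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, V = np.linalg.eigh(M)
    dirs = Sigma_inv_sqrt @ V[:, ::-1][:, :n_directions]
    return dirs / np.linalg.norm(dirs, axis=0)
```

In the paper's terms, SIR corresponds to a particular (suboptimal) choice of the quadratic discrepancy's inner-product matrix; the IRE instead chooses that matrix to attain the asymptotic efficiency bound.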
Journal of the American Statistical Association
"Sufficient dimension reduction via inverse regression: A minimum discrepancy approach" (2005). Faculty Bibliography 2000s. 5086.