Keywords

Lasso, coordinate descent, elastic net, smooth lasso, sparsity, collinearity, high dimensional data, variable selection

Abstract

In linear regression, the traditional technique addresses the case where the number of observations n exceeds the number of predictor variables p (n > p). In the case n < p, the classical method fails to estimate the coefficients. A solution to this problem in the case of correlated predictors is provided in this thesis. A new regularization and variable selection method is proposed under the name of Sparse Ridge Fusion (SRF). In the case of highly correlated predictors, simulated examples and a real data set show that the SRF always outperforms the lasso, the elastic net, and the S-Lasso. The results also show that the SRF can select more predictor variables than the sample size n, whereas the lasso selects at most n variables.
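To illustrate the saturation property mentioned in the abstract, the following is a minimal sketch (not code from the thesis) that fits the lasso and the elastic net on simulated correlated data with p > n, using scikit-learn as a stand-in; the penalty settings (alpha, l1_ratio) and the simulated design are illustrative assumptions, and the SRF itself is not implemented here.

    import numpy as np
    from sklearn.linear_model import Lasso, ElasticNet

    # Simulate a high-dimensional setting: n observations, p > n correlated predictors.
    rng = np.random.default_rng(0)
    n, p = 50, 200
    # Correlated design: a common latent factor shared across predictors, plus noise.
    latent = rng.standard_normal((n, 1))
    X = 0.7 * latent + 0.3 * rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:80] = 1.0                      # 80 true signals, more than n = 50
    y = X @ beta + rng.standard_normal(n)

    # The lasso (pure L1 penalty) can select at most n variables when p > n.
    lasso = Lasso(alpha=0.05, max_iter=10000).fit(X, y)
    print("lasso selected:", np.sum(lasso.coef_ != 0))       # bounded above by n

    # The elastic net adds an L2 term, which removes that saturation limit.
    enet = ElasticNet(alpha=0.05, l1_ratio=0.5, max_iter=10000).fit(X, y)
    print("elastic net selected:", np.sum(enet.coef_ != 0))  # can exceed n

The lasso's nonzero count here is bounded above by n = 50 regardless of the true number of signals, while L1+L2 penalties such as the elastic net (and, per the thesis, the SRF) can recover more than n predictors.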

Notes

If this is your thesis or dissertation and you want to learn how to access it, or for more information about readership statistics, contact us at STARS@ucf.edu

Graduation Date

2013

Semester

Fall

Advisor

Maboudou, Edgard

Degree

Master of Science (M.S.)

College

College of Sciences

Department

Statistics

Degree Program

Statistical Computing

Format

application/pdf

Identifier

CFE0005031

URL

http://purl.fcla.edu/fcla/etd/CFE0005031

Language

English

Release Date

December 2013

Length of Campus-only Access

None

Access Status

Masters Thesis (Open Access)

Subjects

Dissertations, Academic -- Sciences, Sciences -- Dissertations, Academic
