Central double cross-validation for estimating parameters in regression models

Bibliographic Details
Main Author: Chye, Rou Shi
Format: Thesis
Language: English
Published: 2016
Online Access:http://eprints.utm.my/id/eprint/80959/2/ChyeRouShiMFS2016.pdf
http://eprints.utm.my/id/eprint/80959/
http://dms.library.utm.my:8080/vital/access/manager/Repository/vital:120286
Summary: Ridge regression, the lasso, the elastic net, forward stagewise regression and least angle regression require a solution path and a tuning parameter, λ, to estimate the coefficient vector. It is therefore crucial to find the ideal λ. Cross-validation (CV) is the most widely used method for choosing the ideal tuning parameter from the solution path. CV essentially splits the original sample into two parts: one part is used to develop the regression equation, and the regression equation is then applied to the other part to estimate the risk of every model. The final model is the one with the smallest estimated risk. However, CV does not give consistent results because it is prone to overfitting and underfitting during model selection. In the present study, a new method for estimating the parameter in best-subset regression, called central double cross-validation (CDCV), is proposed. In this method, CV is run twice with different numbers of folds. CDCV therefore maximizes the use of the available data, improves model-selection performance and builds a new, stable CV curve. The final model is the one whose error is less than ?? standard error above the smallest CV error. CDCV was compared with existing CV methods in identifying the correct model through a simulation study with different sample sizes and correlation settings. The simulation study indicates that the proposed CDCV method has the highest percentage of selecting the right model and the lowest Bayesian information criterion (BIC) value across the simulated settings. The results show that CDCV can select the right model and prevent underfitting and overfitting. CDCV is therefore recommended as a good alternative to the existing methods under the simulation settings considered.
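
As a rough illustration of the CV machinery described above, the following is a minimal NumPy sketch of k-fold cross-validation for choosing λ in ridge regression, using a closed-form ridge fit, squared-error risk and a "within some standard error of the minimum" rule. The λ grid, fold count, simulated data and standard-error multiplier (left as "??" in the summary) are placeholder assumptions; the CDCV-specific step of running CV twice with different numbers of folds and combining the two curves is described only in the thesis itself and is not reproduced here.

import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge estimate: (X'X + lam*I)^(-1) X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def kfold_cv_curve(X, y, lambdas, k=10, seed=0):
    # Mean held-out squared error and its standard error for every lambda.
    n = len(y)
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), k)
    errs = np.empty((len(lambdas), k))
    for j, lam in enumerate(lambdas):
        for f, test_idx in enumerate(folds):
            train_idx = np.setdiff1d(np.arange(n), test_idx)
            beta = ridge_fit(X[train_idx], y[train_idx], lam)
            resid = y[test_idx] - X[test_idx] @ beta
            errs[j, f] = np.mean(resid ** 2)
    return errs.mean(axis=1), errs.std(axis=1, ddof=1) / np.sqrt(k)

# Simulated data (sizes and true coefficients are arbitrary, for illustration only).
rng = np.random.default_rng(1)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.concatenate([[3.0, 1.5, 0.0, 0.0, 2.0], np.zeros(p - 5)])
y = X @ beta_true + rng.standard_normal(n)

lambdas = np.logspace(-3, 3, 50)          # candidate tuning parameters
mean_err, se_err = kfold_cv_curve(X, y, lambdas, k=10)

j_min = int(np.argmin(mean_err))          # lambda with the smallest CV error
# "Within ?? standard error" rule; the multiplier is unspecified in the summary,
# so 1.0 below is only a placeholder.
threshold = mean_err[j_min] + 1.0 * se_err[j_min]
j_rule = int(np.where(mean_err <= threshold)[0].max())   # largest lambda within the threshold

print(f"lambda at minimum CV error : {lambdas[j_min]:.4g}")
print(f"lambda chosen by the rule  : {lambdas[j_rule]:.4g}")

Among all λ whose CV error falls below the threshold, the sketch picks the largest (most heavily shrunk) one; choosing a simpler model than the bare CV minimum is the usual way such a rule guards against the overfitting effect mentioned in the summary.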