Bias and variance reduction in estimation of model dimension
- by Wei-Yin Loh and Xiaodong Zheng
- Proc. Amer. Math. Soc. 122 (1994), 1263-1272
Abstract:
The problem of estimating the number of regressors to include in a linear regression model is considered. Estimators based on the final prediction error and Akaike’s criterion frequently have large positive bias. Shrinkage correction factors and bootstrapping are used to produce new estimators with reduced bias. The asymptotic bias and mean-squared errors of these estimators are derived analytically. Finite-sample estimates are obtained by simulation.
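To make the criteria in the abstract concrete, here is a minimal NumPy sketch of selecting the model dimension by minimizing the final prediction error (FPE) or AIC over nested models, followed by a simple residual-bootstrap estimate of the selection bias. The synthetic design, the criterion formulas used (FPE$(k) = \hat\sigma_k^2 (n+k)/(n-k)$, AIC$(k) = n\log(\mathrm{RSS}_k/n) + 2k$), and the bootstrap correction shown are illustrative assumptions, not the paper's exact estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic design (assumption, not the paper's setup): the true model
# uses the first 3 of 8 candidate regressors.
n, p_max, p_true = 50, 8, 3
X = rng.standard_normal((n, p_max))
beta = np.zeros(p_max)
beta[:p_true] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

def rss(k, y):
    """Residual sum of squares of the OLS fit on the first k regressors."""
    if k == 0:
        return float(y @ y)
    b, *_ = np.linalg.lstsq(X[:, :k], y, rcond=None)
    r = y - X[:, :k] @ b
    return float(r @ r)

def select(y, crit="aic"):
    """Model dimension minimizing FPE or AIC over nested models k = 0..p_max."""
    ks = np.arange(p_max + 1)
    if crit == "fpe":
        vals = [rss(k, y) / n * (n + k) / (n - k) for k in ks]
    else:
        vals = [n * np.log(rss(k, y) / n) + 2 * k for k in ks]
    return int(ks[np.argmin(vals)])

p_aic = select(y, "aic")
p_fpe = select(y, "fpe")

# Residual-bootstrap estimate of the selection bias: refit the chosen
# model, resample residuals, and re-run selection on each pseudo-sample.
# (A simplified stand-in for the paper's bootstrap correction.)
b, *_ = np.linalg.lstsq(X[:, :p_aic], y, rcond=None)
fit = X[:, :p_aic] @ b
res = y - fit
boot = [select(fit + rng.choice(res, size=n, replace=True)) for _ in range(200)]
bias_hat = np.mean(boot) - p_aic
p_corrected = p_aic - bias_hat

print(p_fpe, p_aic, round(float(p_corrected), 2))
```

Because both criteria penalize each extra regressor only lightly, the minimizer tends to sit at or above the true dimension, which is the positive bias the paper's shrinkage and bootstrap corrections target.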
Additional Information
- © Copyright 1994 American Mathematical Society
- Journal: Proc. Amer. Math. Soc. 122 (1994), 1263-1272
- MSC: Primary 62J05; Secondary 62F11
- DOI: https://doi.org/10.1090/S0002-9939-1994-1211583-3
- MathSciNet review: 1211583