Abstract
This paper considers a linear regression model with possible multicollinearity.
When the matrix $A^{t}A$ is nearly singular, the least squares estimator (LSE) becomes
unstable. Typical remedies for this problem include the generalized ridge estimator
due to Hoerl and Kennard (1970a, b) and its derivatives. Among them, we focus on
an adaptive ridge estimator discussed by Wang and Chow (1990) under normality. We
assume that the error term $e$ follows a spherically symmetric distribution and
derive a sufficient condition under which the estimator is superior to the LSE in mean
squared error (MSE) and quadratic loss. Several numerical examples are also given.
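For concreteness, the two estimators contrasted above take the familiar forms below; this is a sketch under the assumed notation $y = A\beta + e$ with a nonnegative definite ridge matrix $K$, neither of which is fixed in the abstract itself.
% Assumed notation: y = A\beta + e is the regression model, A the design matrix,
% and K a nonnegative definite ridge matrix (K = kI gives ordinary ridge regression).
\[
  \hat{\beta}_{\mathrm{LS}} = (A^{t}A)^{-1}A^{t}y ,
  \qquad
  \hat{\beta}(K) = (A^{t}A + K)^{-1}A^{t}y .
\]
When $A^{t}A$ is nearly singular, small eigenvalues inflate $(A^{t}A)^{-1}$ and hence the variance of $\hat{\beta}_{\mathrm{LS}}$, which is the instability the ridge-type estimators are designed to control.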