Abstract
Although several statistics are available for selecting prediction variables in parametric regression analyses such as ordinary multiple regression, no suitable statistics exist for nonparametric regression analyses such as generalized additive models. We therefore first focused on AIC, Cross Validation (CV), and Generalized Cross Validation (GCV). AIC is a criterion based on the maximum likelihood and on information entropy. Although GCV was devised to remove the leverage effect that is unavoidable in the Cross Validation method, it is very hard to compute. We therefore devised several transformations of CV that retain the idea behind GCV while keeping the simplicity of CV. We applied these statistics to several artificial data sets and compared their ability to select prediction variables. GCV gave slightly better results than the others, but, contrary to our expectation, it did not always give stable results. This illustrates the difficulty of predictive testing in nonparametric regression analyses. We will continue to examine and improve these statistics using other artificial data.
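For reference, the standard leave-one-out CV and GCV scores for a linear smoother $\hat{y} = A(\lambda)y$ can be written as below; the notation, in particular the smoother (hat) matrix $A(\lambda)$ and smoothing parameter $\lambda$, is introduced here for illustration and the authors' own transformations of CV are not reproduced.

\[
\mathrm{CV}(\lambda) = \frac{1}{n}\sum_{i=1}^{n}\left(\frac{y_i - \hat{f}_\lambda(x_i)}{1 - A_{ii}(\lambda)}\right)^{2},
\qquad
\mathrm{GCV}(\lambda) = \frac{1}{n}\sum_{i=1}^{n}\left(\frac{y_i - \hat{f}_\lambda(x_i)}{1 - \operatorname{tr}A(\lambda)/n}\right)^{2}.
\]

GCV replaces the individual leverages $A_{ii}(\lambda)$ in CV with their average $\operatorname{tr}A(\lambda)/n$, which is how it removes the influence of individual leverage values.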