Ouyou toukeigaku
Online ISSN : 1883-8081
Print ISSN : 0285-0370
ISSN-L : 0285-0370
Volume 33, Issue 1
Displaying 1-4 of 4 articles from this issue
  • Masanori Ito, Masashi Goto
    2004 Volume 33 Issue 1 Pages 3-26
    Published: August 25, 2004
    Released on J-STAGE: June 12, 2009
    JOURNAL FREE ACCESS
    In this paper, we introduce a Nonparametric Transform-Both-Sides (NTB) approach as an alternative to the Power Transform-Both-Sides (PTB) approach to inference for theoretical models, and propose a method of parameter estimation that expresses the transformation as a cubic spline curve. From the investigation of two examples, we suggest that the NTB can serve as a benchmark for validating the PTB and is more robust to outliers than the PTB. We verify these results with three simulation experiments. For fitting the empirical model, we introduce Alternating Conditional Expectations (ACE) and Additivity and VAriance Stabilization (AVAS), two nonparametric transformation approaches that optimize the relationship between response and explanatory variables. We examine the validity of the theoretical models by fitting empirical models via ACE and AVAS to the example data. Both methods improve the normality and homoscedasticity of the errors.
    Download PDF (1875K)
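The transform-both-sides idea behind this paper can be illustrated with a short sketch. The Python fragment below fits a hypothetical Michaelis-Menten model under a Box-Cox power transform applied to both the response and the model prediction, profiling out the error variance; the model, data, and names are illustrative assumptions, not the authors' code (the paper's NTB variant replaces the power transform with a cubic spline).

```python
# A minimal sketch of the power transform-both-sides (PTB) idea, assuming a
# hypothetical Michaelis-Menten theoretical model; illustrative only.
import numpy as np
from scipy.optimize import minimize

def boxcox(y, lam):
    """Box-Cox power transform, applied to both sides under PTB."""
    return np.log(y) if abs(lam) < 1e-8 else (y**lam - 1.0) / lam

def ptb_negloglik(params, x, y):
    vmax, km, lam = params
    fx = np.maximum(vmax * x / (km + x), 1e-8)  # guard against nonpositive fits
    resid = boxcox(y, lam) - boxcox(fx, lam)
    n = len(y)
    # Profile normal negative log-likelihood, including the Box-Cox
    # Jacobian term (lam - 1) * sum(log y).
    return 0.5 * n * np.log(np.mean(resid**2)) - (lam - 1.0) * np.sum(np.log(y))

rng = np.random.default_rng(0)
x = np.linspace(0.5, 10, 50)
y = 2.0 * x / (3.0 + x) * np.exp(0.1 * rng.standard_normal(50))  # multiplicative error
fit = minimize(ptb_negloglik, x0=[1.0, 1.0, 1.0], args=(x, y), method="Nelder-Mead")
print("theta-hat:", fit.x[:2], "lambda-hat:", fit.x[2])
```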
  • Wataru Sakamoto
    2004 Volume 33 Issue 1 Pages 27-49
    Published: August 25, 2004
    Released on J-STAGE: June 12, 2009
    JOURNAL FREE ACCESS
    In nonparametric regression models, requirements such as homoscedasticity are implicitly assumed, and their violation leads to poor estimation of regression functions. A power weighted smoothing spline (PWSS) model is proposed whose objective is to diagnose homoscedasticity as well as to estimate unknown nonlinear regression structure. The responses in an additive regression model are power-transformed, and their variances after transformation are assumed to be constant. Smoothing splines are obtained as estimated functions by maximizing the penalized likelihood, and a reweighted version of a backfitting algorithm is constructed. A power-transformation parameter and smoothing parameters, which control the smoothness of the functions, are estimated by maximizing the marginal likelihood, based on Bayesian approaches to smoothing splines. A form of the marginal likelihood that is comparatively easy to compute is derived using the property that smoothing splines are the best linear unbiased predictors of a linear mixed model. Examination of some data sets from the literature and a simulation experiment shows that the power transformation estimated with the PWSS model attains homoscedasticity while taking nonlinear structure into account.
    Download PDF (952K)
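A rough sketch of the PWSS idea, for orientation only: power-transform the response, fit a smoothing spline, and choose the transform parameter by profile likelihood. Here scipy's UnivariateSpline stands in for the paper's penalized-likelihood and mixed-model machinery, and the smoothing level is fixed for simplicity, whereas the paper estimates it by marginal likelihood.

```python
# A hedged sketch: Box-Cox transform of the response plus a smoothing
# spline, with the transform parameter chosen by profile likelihood.
import numpy as np
from scipy.interpolate import UnivariateSpline

def boxcox(y, lam):
    return np.log(y) if abs(lam) < 1e-8 else (y**lam - 1.0) / lam

def profile_negloglik(lam, x, y):
    z = boxcox(y, lam)
    spline = UnivariateSpline(x, z, s=0.05 * len(x))  # fixed smoothing level
    resid = z - spline(x)
    n = len(y)
    # Normal profile likelihood plus the Box-Cox Jacobian term.
    return 0.5 * n * np.log(np.mean(resid**2)) - (lam - 1.0) * np.sum(np.log(y))

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 10.0, 100))
y = np.exp(np.sin(x) + 0.2 * rng.standard_normal(100))  # variance grows with mean
lams = np.linspace(-1.0, 2.0, 61)
best = min(lams, key=lambda lam: profile_negloglik(lam, x, y))
print("estimated power-transform parameter:", best)
```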
  • Hirokazu Yanagihara, Megu Ohtaki
    2004 Volume 33 Issue 1 Pages 51-69
    Published: August 25, 2004
    Released on J-STAGE: June 12, 2009
    JOURNAL FREE ACCESS
    The B-spline method for scatterplot smoothing has the advantages of simplicity and computational efficiency over other nonparametric smoothing methods such as the kernel method, but its excess flexibility makes it unstable, over-fitting data with complicated trends. In this paper, we remedy this drawback of the B-spline method by introducing an information criterion that optimizes the number of knots and the smoothing parameter so as to avoid over-fitting.
    Download PDF (706K)
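The knot-and-penalty selection idea can be sketched as a penalized B-spline fit scored by an information criterion. In the sketch below, AIC with effective degrees of freedom (the trace of the hat matrix) is a stand-in, as the paper's criterion may differ; all names and settings are illustrative.

```python
# A hedged sketch of penalized B-spline smoothing scored by an
# information criterion over a grid of smoothing parameters.
import numpy as np
from scipy.interpolate import BSpline

def pspline_score(x, y, n_knots=20, degree=3, lam=1.0):
    # Open uniform knot vector with repeated boundary knots.
    inner = np.linspace(x.min() - 1e-9, x.max() + 1e-9, n_knots)
    t = np.r_[[inner[0]] * degree, inner, [inner[-1]] * degree]
    B = BSpline.design_matrix(x, t, degree).toarray()
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)  # second-difference penalty
    A = B.T @ B + lam * (D.T @ D)
    coef = np.linalg.solve(A, B.T @ y)
    edf = np.trace(B @ np.linalg.solve(A, B.T))   # effective degrees of freedom
    rss = np.sum((y - B @ coef) ** 2)
    return len(y) * np.log(rss / len(y)) + 2.0 * edf  # AIC-type score

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 150))
y = np.sin(8.0 * x) + 0.15 * rng.standard_normal(150)
# Grid search over the smoothing parameter; pick the score minimizer.
scores = {l: pspline_score(x, y, lam=l) for l in 10.0 ** np.arange(-4.0, 3.0)}
print("best smoothing parameter:", min(scores, key=scores.get))
```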
  • Satoshi Miyata
    2004 Volume 33 Issue 1 Pages 71-91
    Published: August 25, 2004
    Released on J-STAGE: June 12, 2009
    JOURNAL FREE ACCESS
    In modeling procedures for non-parametric regression, model selection via optimization of the model tuning parameters becomes an important problem. This article discusses optimal model selection with a data-adaptive penalty and optimization using the evolutionary algorithm (EA). Most model selection criteria, such as AIC, BIC, and Mallows's Cp, use a fixed penalty to control the complexity of the models. These model selection procedures are non-adaptive, and their performance depends on the required complexity of the model: criteria with a "large" penalty perform well only for "simple" models, and vice versa. To avoid selection bias of this kind, we adopt the adaptive model selection criterion (AMSC) with a data-adaptive penalty, which is defined as the best estimator of the relative squared loss between the true model and the estimator and is applicable to various degrees of model structure complexity. In general, optimization for model selection is a complex non-linear problem. Most conventional methodologies adopt deterministic procedures, such as grid search or stepwise techniques, and obtain only suboptimal solutions. In this paper, we consider global optimization of the AMSC via stochastic optimization procedures such as the EA. The proposed procedure is applied to 1) a hierarchical neural network, i) a single-layer feed-forward neural network, ii) a radial basis function network, and 2) a Support Vector Machine, and its effectiveness is confirmed via simulation studies.
    Download PDF (930K)
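As a toy illustration of tuning model parameters by stochastic optimization rather than grid search, the sketch below tunes a radial-basis-function network's width and ridge penalty with differential evolution (one member of the EA family). GCV is used as a stand-in for the paper's AMSC, whose exact form is defined in the article; all settings are assumptions.

```python
# A toy sketch: optimize a model-selection score over tuning parameters
# with an evolutionary algorithm (differential evolution).
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(-3.0, 3.0, 120))
y = np.sinc(x) + 0.1 * rng.standard_normal(120)
centers = np.linspace(-3.0, 3.0, 15)  # fixed RBF centers

def gcv_score(params):
    log_width, log_ridge = params
    width, ridge = np.exp(log_width), np.exp(log_ridge)
    Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))
    A = Phi.T @ Phi + ridge * np.eye(len(centers))
    H = Phi @ np.linalg.solve(A, Phi.T)  # smoother ("hat") matrix
    resid = y - H @ y
    n = len(y)
    # Generalized cross-validation, standing in for the AMSC.
    return (np.sum(resid**2) / n) / (1.0 - np.trace(H) / n) ** 2

result = differential_evolution(gcv_score, bounds=[(-3, 1), (-10, 2)], seed=0)
print("best (log width, log ridge):", result.x, "GCV:", result.fun)
```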