Abstract
We propose a method for optimizing the embedding dimension and the delay time used in local linear prediction, adapting them to the local structure of the embedded state space. In particular, we assume that the observed time-series data may be noisy and of limited length. In this case, even if the dynamics generating the data are stable and stationary, it is advantageous to vary the embedding parameters dynamically. To this end, we apply the bagging algorithm to local linear prediction and optimize the parameters so as to minimize the prediction risk estimated by an ensemble of bagging predictors. To confirm the validity of the method, we perform numerical simulations on chaotic time-series data and real financial data.
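The procedure outlined in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the neighbourhood size `k`, the number of bootstrap resamples, and the use of ensemble variance as the prediction-risk proxy are all assumptions made here for concreteness.

```python
import numpy as np

def delay_embed(x, m, tau):
    """Delay-coordinate embedding with dimension m and delay tau.
    Row t is the state vector [x[t], x[t+tau], ..., x[t+(m-1)*tau]]."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def bagged_local_linear_predict(x, m, tau, k=10, n_bags=25, seed=0):
    """One-step-ahead bagged local linear prediction.
    Returns (bagged prediction, ensemble variance as a risk proxy)."""
    rng = np.random.default_rng(seed)
    E = delay_embed(x, m, tau)
    query = E[-1]                          # current state; predict its successor
    X = E[:-1]                             # states with a known one-step successor
    y = x[(m - 1) * tau + 1:]              # y[i] = x[i + (m-1)*tau + 1]
    # k nearest neighbours of the query in the embedded state space
    idx = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    preds = []
    for _ in range(n_bags):
        boot = rng.choice(idx, size=k, replace=True)   # bootstrap resample neighbours
        A = np.column_stack([X[boot], np.ones(k)])     # affine local model
        coef, *_ = np.linalg.lstsq(A, y[boot], rcond=None)
        preds.append(np.append(query, 1.0) @ coef)
    preds = np.array(preds)
    return preds.mean(), preds.var()

def select_embedding(x, dims=(2, 3, 4), delays=(1, 2)):
    """Choose (m, tau) minimising the ensemble-based risk estimate."""
    return min(((m, tau) for m in dims for tau in delays),
               key=lambda p: bagged_local_linear_predict(x, *p)[1])
```

Re-running `select_embedding` on each local neighbourhood (rather than once globally) would correspond to the adaptive, dynamically changing parameter choice described in the abstract.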