応用数理 (Bulletin of the Japan Society for Industrial and Applied Mathematics)
Online ISSN : 2432-1982
Extended Stochastic Complexity and Information-Theoretic Learning Theory
Kenji Yamanishi (山西 健司)

1998, Vol. 8, No. 3, pp. 188-203

Abstract

Rissanen introduced stochastic complexity to define the amount of information in a given data sequence relative to a given hypothesis class of probability densities, where information is measured in terms of a logarithmic loss associated with universal data compression. This paper proposes the notion of extended stochastic complexity (ESC), which generalizes Rissanen's stochastic complexity to the decision-theoretic setting in which a general real-valued function serves as a hypothesis and a general loss function serves as a distortion measure. We thereby demonstrate the effectiveness of ESC in the design and analysis of learning algorithms in sequential-prediction and function-estimation scenarios. As an application of ESC to sequential prediction, this paper shows that a sequential realization of ESC produces a sequential prediction algorithm called the aggregating strategy, for which the worst-case cumulative prediction loss is asymptotically minimal. As an application of ESC to function estimation, this paper shows that a batch approximation of ESC induces a batch-learning algorithm called the minimum L-complexity algorithm (MLC), for which the upper bound on the statistical risk is the smallest obtained to date. Through ESC, we give a unifying view of the design of effective learning algorithms for fundamental learning problems.
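The aggregating strategy mentioned in the abstract maintains a weight for each hypothesis and predicts with a weighted combination, discounting each weight exponentially by its incurred loss. The following is a minimal sketch of this exponential-weighting idea only, not the paper's exact algorithm: the constant experts, the squared loss, and the learning rate `eta` are all illustrative assumptions.

```python
import math

def aggregate_prediction(expert_preds, weights):
    """Predict with the normalized weighted average of the experts' predictions."""
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, expert_preds)) / total

def update_weights(weights, expert_preds, outcome, eta, loss):
    """Multiplicative exponential-weight update after observing the outcome."""
    return [w * math.exp(-eta * loss(p, outcome))
            for w, p in zip(weights, expert_preds)]

sq_loss = lambda p, y: (p - y) ** 2

# Two hypothetical constant experts: one always predicts 0.0, the other 1.0.
experts = [lambda t: 0.0, lambda t: 1.0]
weights = [1.0, 1.0]       # uniform prior over the hypothesis class
eta = 2.0                  # assumed learning rate
outcomes = [1.0, 1.0, 0.0, 1.0]

cumulative_loss = 0.0
for t, y in enumerate(outcomes):
    preds = [e(t) for e in experts]
    p_hat = aggregate_prediction(preds, weights)
    cumulative_loss += sq_loss(p_hat, y)
    weights = update_weights(weights, preds, y, eta, sq_loss)
```

After the loop, the expert whose predictions matched the outcome sequence best retains the largest weight, so the strategy's cumulative loss tracks that of the best single hypothesis in hindsight.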

© 1998 The Japan Society for Industrial and Applied Mathematics