Parametric inference based on empirically estimable divergences is becoming popular. This paper focuses on divergences that possess affine invariance as well as empirical estimability. Under some assumptions, we show that a divergence with these two properties is essentially equivalent to the Hölder divergence. We discuss the intersection of the Hölder divergence and the well-known Bregman divergence, which reduces to a simple divergence with only two parameters. We also consider parametric inference based on the Hölder divergence using an enlarged model; this idea enables us to estimate the outlier ratio as well as the model parameter. In addition, the resulting parameter estimate is based on the γ-divergence, which is well known to be useful in robust statistics, so the estimate inherits various favorable robustness properties.
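For concreteness, a commonly used form of the γ-divergence mentioned above is the one introduced by Fujisawa and Eguchi (the parametrization adopted in the body of the paper may differ by a constant factor):

D_\gamma(f, g) = \frac{1}{\gamma(1+\gamma)} \log \int f(x)^{1+\gamma}\,dx \;-\; \frac{1}{\gamma} \log \int f(x)\, g(x)^{\gamma}\,dx \;+\; \frac{1}{1+\gamma} \log \int g(x)^{1+\gamma}\,dx .

Its nonnegativity follows from Hölder's inequality, \int f g^{\gamma}\,dx \le \bigl(\int f^{1+\gamma}\,dx\bigr)^{1/(1+\gamma)} \bigl(\int g^{1+\gamma}\,dx\bigr)^{\gamma/(1+\gamma)}, with equality if and only if g is proportional to f. When g ranges over a model and f denotes the data-generating density, the first term is a constant and the cross term depends on f only through the expectation of g(X)^{\gamma}, which can be estimated by the sample average of g(x_i)^{\gamma}; this is the sense in which such divergences are empirically estimable.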