Journal of the Japan Statistical Society, Japanese Issue
Online ISSN : 2189-1478
Print ISSN : 0389-5602
ISSN-L : 0389-5602
Special Topic: The JSS Research Prize Lecture
Divergence-based Statistical Inference
Takafumi Kanamori, Hironori Fujisawa

2017 Volume 47 Issue 1 Pages 1-18

Abstract

Parametric inference based on empirically estimable divergences is becoming popular. This paper focuses on divergences that possess affine invariance in addition to empirical estimability. Under some assumptions, we show that a divergence with these two properties is essentially equivalent to the Hölder divergence. We then discuss the intersection of the Hölder divergence and the well-known Bregman divergence, which turns out to be a simple divergence family with only two parameters. We also consider parametric inference based on the Hölder divergence using an enlarged model; this idea enables us to estimate the outlier ratio as well as the model parameter. In addition, the resulting parameter estimate is based on the γ-divergence, which is known to be very useful in robust statistics, so the estimate inherits its various desirable properties.
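As a concrete illustration of the robustness property mentioned above, the sketch below minimizes the empirical γ-cross-entropy for a normal location-scale model under outlier contamination. This is a minimal sketch, not the paper's method: the model (univariate normal), the choice γ = 0.5, the contamination setup, and the use of Nelder–Mead optimization are all illustrative assumptions. It uses the standard closed form ∫ f^{1+γ} dx = (2πσ²)^{−γ/2}(1+γ)^{−1/2} for the normal density.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def gamma_cross_entropy(theta, x, gamma):
    """Empirical gamma-cross-entropy for a normal model N(mu, sigma^2).

    Objective: -(1/gamma) * log(mean(f^gamma))
               + (1/(1+gamma)) * log(int f^{1+gamma} dx).
    Observations with tiny model density contribute little to
    mean(f^gamma), so outliers are automatically down-weighted.
    """
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # reparameterize to enforce sigma > 0
    f = norm.pdf(x, mu, sigma)
    # closed-form integral of f^{1+gamma} for the normal density
    int_f = (2 * np.pi * sigma**2) ** (-gamma / 2) / np.sqrt(1 + gamma)
    return -np.log(np.mean(f**gamma)) / gamma + np.log(int_f) / (1 + gamma)

rng = np.random.default_rng(0)
# 90% inliers from N(0, 1), 10% outliers from N(10, 1)
x = np.concatenate([rng.normal(0, 1, 180), rng.normal(10, 1, 20)])

res = minimize(gamma_cross_entropy, x0=[np.median(x), 0.0],
               args=(x, 0.5), method="Nelder-Mead")
mu_hat = res.x[0]
# The sample mean is pulled toward the outliers (roughly 1.0 here),
# while the gamma-divergence estimate of mu stays near the true value 0.
print(f"sample mean: {x.mean():.2f}, gamma-estimate of mu: {mu_hat:.2f}")
```

Smaller γ brings the estimator closer to maximum likelihood (efficient but non-robust); larger γ increases robustness at some cost in efficiency.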

© 2017 Japan Statistical Society