Interdisciplinary Information Sciences
Online ISSN : 1347-6157
Print ISSN : 1340-9050
ISSN-L : 1340-9050
GSIS SELECTED LECTURES: Exploring Collaborative Mathematics
An Introduction to Maximum Likelihood Estimation and Information Geometry
Keiji MIURA

2011 Volume 17 Issue 3 Pages 155-174

Abstract

In this paper, we review the maximum likelihood method for estimating the statistical parameters that specify a probabilistic model and show that it generally yields an asymptotically optimal estimator with minimum mean square error. Thus, for most applications in the information sciences, maximum likelihood estimation suffices. The Fisher information matrix, which defines the orthogonality between parameters in a probabilistic model, arises naturally from maximum likelihood estimation. As the inverse of the Fisher information matrix gives the covariance matrix of the estimation errors of the parameters, orthogonalizing the parameters guarantees that their estimates are distributed independently of each other. The theory of information geometry provides procedures to orthogonalize parameters globally, that is, at all parameter values, at least for the exponential and mixture families of distributions. This global orthogonalization gives a simpler and clearer view of statistical inference and, for example, makes it possible to perform a statistical test for each unknown parameter separately. Therefore, for practical applications, a good starting point is to examine whether the probabilistic model under study belongs to one of these families.
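As a minimal numerical sketch of the abstract's central claim (not an example from the paper itself), the following Python snippet computes the maximum likelihood estimate of the rate parameter of an exponential distribution and checks, by Monte Carlo simulation, that the variance of the estimator approaches the inverse of the Fisher information divided by the sample size; the parameter values and sample sizes are illustrative assumptions.

```python
import numpy as np

# Exponential model p(x | lam) = lam * exp(-lam * x).
# The log-likelihood sum_i (log lam - lam * x_i) is maximized at
# lam_hat = 1 / mean(x).  The Fisher information per sample is
# I(lam) = 1 / lam**2, so the asymptotic variance of the MLE is
# lam**2 / n (the Cramer-Rao bound).

rng = np.random.default_rng(0)
lam_true = 2.0     # true rate parameter (chosen for illustration)
n = 1000           # samples per experiment
n_trials = 2000    # repeated experiments to estimate the MLE's variance

# Each row is one experiment; the MLE is the reciprocal of the sample mean.
samples = rng.exponential(scale=1.0 / lam_true, size=(n_trials, n))
lam_hat = 1.0 / samples.mean(axis=1)

empirical_var = lam_hat.var()
asymptotic_var = lam_true**2 / n   # (n * Fisher information)^(-1)

print(f"empirical variance of MLE : {empirical_var:.6f}")
print(f"1 / (n * I(lam))          : {asymptotic_var:.6f}")
```

The two printed numbers should agree to within Monte Carlo error, illustrating that the inverse Fisher information gives the covariance of the estimation errors, as stated in the abstract.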

© 2011 by the Graduate School of Information Sciences (GSIS), Tohoku University

This article is licensed under a Creative Commons [Attribution 4.0 International] license.
https://creativecommons.org/licenses/by/4.0/