Abstract
Information geometry was advocated by Shun-ichi Amari in the early 1980s; it provides
geometric insight for understanding statistical ideas such as information, sufficiency and efficiency. Furthermore,
it is closely connected with almost all areas of the mathematical sciences, including the information,
statistical, physical, biological and brain sciences. One of the most characteristic features of information geometry is
the dualistic pair of the e-connection and the m-connection on the space of all probability density functions, which
can be viewed as a Riemannian space with a metric tensor derived from the Fisher information matrix. Surprisingly,
the essential theorem reduces to the Pythagorean theorem known since ancient Greece,
which provides a view of the interplay between a statistical model and estimation through a Pythagorean
foliation of the space of probability density functions. Finally, we review the present status and
future directions in information geometry.
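For concreteness, the two objects named above may be sketched as follows, in local coordinates $\theta$ of a parametric family $p(x;\theta)$ and with $D(\cdot\,\|\,\cdot)$ denoting the Kullback--Leibler divergence; the densities $p, q, r$ are placeholders chosen here only for illustration:
\[
g_{ij}(\theta) \;=\; \mathbb{E}_{\theta}\!\left[\,\partial_i \log p(X;\theta)\,\partial_j \log p(X;\theta)\,\right],
\qquad
D(p\,\|\,r) \;=\; D(p\,\|\,q) + D(q\,\|\,r),
\]
the latter Pythagorean relation holding whenever the m-geodesic from $p$ to $q$ and the e-geodesic from $q$ to $r$ intersect orthogonally at $q$ with respect to the metric $g$.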