Abstract
Entropy was first introduced in 1865 by Rudolf Clausius in his study of the connection between work and heat. Boltzmann later gave it a mathematical definition as the logarithm of the number of microstates corresponding to a given macrostate. Entropy plays important roles in statistical mechanics, in the theory of large deviations in probability, as an invariant in ergodic theory, and as a fundamental tool in communication theory. This article explores some of the connections between these different contexts.