Abstract
Estimating a divergence between probability distributions is a fundamental challenge in machine learning, because a divergence estimator can be used for various purposes such as two-sample testing, change-point detection, and class-balance estimation. In this article, we review recent advances in direct divergence approximation that do not involve estimating the probability distributions themselves. More specifically, we cover direct approximators of the Kullback-Leibler divergence, the Pearson divergence, the relative Pearson divergence, and the L^2-distance.