2025 Volume 54 Issue 2 Pages 177-203
A non-normalized model is a statistical model defined by an unnormalized density, i.e., a density function that does not integrate to one. In machine learning, such models are often referred to as energy-based models. Examples include Markov random fields, distributions on manifolds, and Boltzmann machines. These models allow for flexible data modeling but present challenges for likelihood-based statistical inference due to the presence of an intractable normalization constant. To address this issue, various statistical inference methods that do not require explicit computation of the normalization constant have been developed. In this paper, we introduce two parameter estimation methods for non-normalized models: score matching and noise contrastive estimation. We also discuss recent advancements, such as information criteria and nonlinear independent component analysis, as well as connections to other statistical methods, including shrinkage estimation and bridge sampling.
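As a concrete illustration of how score matching avoids the normalization constant, the following minimal sketch (an illustrative assumption, not code from the paper) fits the precision of a zero-mean Gaussian from its unnormalized density using Hyvärinen's objective, which involves only derivatives of the log-density and therefore never touches the normalizer. All names and numbers are hypothetical.

```python
import numpy as np

# Unnormalized model: p~(x; theta) = exp(-theta * x^2 / 2), theta = precision.
# The normalizer is deliberately omitted.
# Score function: psi(x; theta) = d/dx log p~(x; theta) = -theta * x,
# so psi'(x; theta) = -theta.
# Hyvarinen's score matching objective:
#   J(theta) = E[ psi'(x; theta) + (1/2) psi(x; theta)^2 ]
#            = E[ -theta + (1/2) theta^2 x^2 ],
# which depends only on the unnormalized density.

rng = np.random.default_rng(0)
true_sigma = 2.0                       # data std; true precision = 1/sigma^2 = 0.25
x = rng.normal(0.0, true_sigma, size=100_000)

def sm_objective(theta, x):
    # Empirical score matching objective for this model.
    return np.mean(-theta + 0.5 * theta**2 * x**2)

# Closed-form minimizer: dJ/dtheta = -1 + theta * mean(x^2) = 0
#   =>  theta_hat = 1 / mean(x^2)
theta_hat = 1.0 / np.mean(x**2)
print(theta_hat)   # close to the true precision 0.25
```

The key point is that `sm_objective` is computable directly from the unnormalized density; no integral over x is ever evaluated.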
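Noise contrastive estimation, the second method mentioned in the abstract, can likewise be sketched with a toy model (again a hypothetical example, not the paper's code). NCE treats the log-normalizer as a free parameter `c` and estimates it jointly with the model parameter by logistic regression that discriminates data from samples of a known noise distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data from N(0, sigma^2) with sigma = 2 (illustrative choice).
# Unnormalized model: log p~(x; theta, c) = -theta * x^2 / 2 + c,
# where c plays the role of the unknown negative log-normalizer.
sigma = 2.0
n = 50_000
x_data = rng.normal(0.0, sigma, size=n)
x_noise = rng.normal(0.0, 3.0, size=n)      # noise distribution N(0, 3^2), fully known

def log_noise(x):
    return -0.5 * (x / 3.0) ** 2 - np.log(3.0 * np.sqrt(2.0 * np.pi))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# NCE = logistic regression with logit
#   G(x) = log p~(x; theta, c) - log p_noise(x):
# maximize mean log sigma(G) over data plus mean log(1 - sigma(G)) over noise.
theta, c = 1.0, 0.0
lr = 0.1
for _ in range(2000):
    g_data = -0.5 * theta * x_data**2 + c - log_noise(x_data)
    g_noise = -0.5 * theta * x_noise**2 + c - log_noise(x_noise)
    r_data = 1.0 - sigmoid(g_data)          # classifier residual on data points
    r_noise = sigmoid(g_noise)              # classifier residual on noise points
    # Gradient of the classification log-likelihood w.r.t. (theta, c).
    grad_theta = (np.mean(r_data * (-0.5 * x_data**2))
                  - np.mean(r_noise * (-0.5 * x_noise**2)))
    grad_c = np.mean(r_data) - np.mean(r_noise)
    theta += lr * grad_theta
    c += lr * grad_c

# theta approaches 1/sigma^2 and exp(c) approaches the true normalizer
# 1 / sqrt(2 * pi * sigma^2), recovered without ever integrating the model.
print(theta, np.exp(c))
```

Because the objective is a concave classification log-likelihood in `(theta, c)`, plain gradient ascent suffices here; the normalization constant comes out as a by-product rather than being required up front.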