Abstract
The notion of spectral relative entropy rate is defined for jointly stationary Gaussian processes. Using classical information-theoretic results, we establish a remarkable connection between the time-domain and spectral-domain relative entropy rates, and hence with a multivariate version of the classical Itakura-Saito divergence. This result appears promising for applications where spectral entropy already plays an important role, such as EEG analysis. It also lends support to a spectral estimation technique recently developed by the authors, in which this multivariate Itakura-Saito distance serves as the spectrum divergence; the approach yields a minimum-complexity spectral estimate. Simulations suggest that the new technique is effective in multivariate spectral estimation, especially for short data records.