2025 Volume E108.A Issue 3 Pages 405-413
H-mutual information (H-MI) is a broad class of information leakage measures, where H = (η, F) is a pair consisting of a monotonically increasing function η and a concave function F that together generalize Shannon entropy. H-MI is defined as the difference between the generalized entropy H and its conditional version, and it includes Shannon mutual information (MI), Arimoto MI of order α, g-leakage, and the expected value of sample information as special cases. This study presents a variational characterization of H-MI via statistical decision theory. Based on this characterization, we propose an alternating optimization algorithm for computing H-capacity.
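The abstract does not spell out the algorithm, but for the Shannon-MI special case (H = Shannon entropy), the capacity is classically computed by the Blahut–Arimoto alternating optimization, which alternates between a closed-form update of the reverse channel q(x|y) and of the input distribution p(x). A minimal sketch of that classical special case (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def blahut_arimoto(W, n_iter=200, tol=1e-12):
    """Shannon capacity max_p I(p; W) of a discrete memoryless channel.

    W: row-stochastic matrix, W[x, y] = P(Y = y | X = x).
    Alternates between the optimal reverse channel q(x|y) for fixed p
    and the closed-form input-distribution update for fixed q.
    Returns (capacity in nats, capacity-achieving input distribution).
    """
    m, _ = W.shape
    p = np.full(m, 1.0 / m)  # start from the uniform input distribution
    for _ in range(n_iter):
        # q-step: q(x|y) proportional to p(x) W(y|x), normalized per column
        q = p[:, None] * W
        q /= q.sum(axis=0, keepdims=True)
        # p-step: p_new(x) proportional to exp(sum_y W(y|x) log q(x|y))
        log_r = (W * np.log(q + 1e-300)).sum(axis=1)
        r = np.exp(log_r - log_r.max())  # shift for numerical stability
        p_new = r / r.sum()
        if np.abs(p_new - p).max() < tol:
            p = p_new
            break
        p = p_new
    # evaluate I(p; W) = sum_{x,y} p(x) W(y|x) log( W(y|x) / p_Y(y) )
    joint = p[:, None] * W
    py = joint.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log(joint / (p[:, None] * py))
    return np.nansum(terms), p
```

For a binary symmetric channel with crossover probability 0.1, this recovers the known capacity ln 2 − H(0.1) ≈ 0.368 nats with the uniform input. The paper's contribution is, per the abstract, an analogous alternating scheme for general H-capacity derived from the variational characterization of H-MI.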