IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Online ISSN : 1745-1337
Print ISSN : 0916-8508


A Variational Characterization of H-Mutual Information and its Application to Computing H-Capacity
Akira KAMATSUKA, Koki KAZAMA, Takahiro YOSHIDA
Advance online publication

Article ID: 2024TAP0010

Abstract

H-mutual information (H-MI) is a wide class of information leakage measures, where H = (η, F) is a pair consisting of a monotonically increasing function η and a concave function F that together define a generalized entropy extending Shannon entropy. H-MI is defined as the difference between the generalized entropy H and its conditional version, and it includes Shannon mutual information (MI), Arimoto MI of order α, g-leakage, and the expected value of sample information as special cases. This study presents a variational characterization of H-MI via statistical decision theory. Based on this characterization, we propose an alternating optimization algorithm for computing the H-capacity.
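
The proposed algorithm belongs to the alternating-optimization family exemplified by the classical Blahut–Arimoto algorithm for Shannon capacity. The sketch below is a point of reference only, not the authors' H-capacity algorithm (whose update rules depend on the pair H = (η, F)); it implements the standard Blahut–Arimoto iteration for a discrete memoryless channel given as a row-stochastic matrix W, with function and variable names chosen here for illustration.

```python
import numpy as np

def blahut_arimoto(W, num_iter=500, tol=1e-12):
    """Classical Blahut-Arimoto alternating optimization (reference sketch,
    not the H-capacity algorithm of the paper).

    W : (|X|, |Y|) array with W[x, y] = P(Y = y | X = x); rows sum to 1.
    Returns (capacity in nats, capacity-achieving input distribution p).
    """
    m, _ = W.shape
    p = np.full(m, 1.0 / m)              # start from the uniform input
    logW = np.zeros_like(W)
    np.log(W, where=(W > 0), out=logW)   # 0 log 0 treated as 0

    for _ in range(num_iter):
        # Step 1: for the current input p, compute the output marginal and
        # D[x] = KL( W(.|x) || P_Y ), the per-input relative entropy.
        q_y = p @ W
        D = np.sum(W * (logW - np.log(np.where(q_y > 0, q_y, 1.0))), axis=1)

        # Step 2: re-weight the input distribution and normalize.
        p_new = p * np.exp(D)
        p_new /= p_new.sum()

        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new

    # I(X; Y) = sum_x p(x) KL( W(.|x) || P_Y ) at the (near-)optimal p.
    q_y = p @ W
    D = np.sum(W * (logW - np.log(np.where(q_y > 0, q_y, 1.0))), axis=1)
    return float(np.sum(p * D)), p

# Example: binary symmetric channel with crossover probability 0.1.
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_opt = blahut_arimoto(W)
print(C / np.log(2), p_opt)   # ~0.531 bits, achieved by the uniform input
```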

© 2024 The Institute of Electronics, Information and Communication Engineers