2021, Vol. E104.B, No. 2, pp. 141-148
Diffusion least-mean-square (LMS) is a method for estimating and tracking an unknown parameter at multiple nodes in a network. When the unknown vector is sparse, the sparsity-promoting version of diffusion LMS, which adds a sparse regularization term to the cost function, is known to achieve better convergence performance than the original diffusion LMS. This paper proposes a novel choice of the coefficients involved in the updates of sparse diffusion LMS, based on the idea of message propagation. Moreover, we optimize the proposed coefficients with respect to the steady-state mean-square deviation. Simulation results demonstrate that the proposed method outperforms conventional methods in terms of convergence performance.
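To make the setting concrete, below is a minimal sketch of a generic sparse (zero-attracting) diffusion LMS in adapt-then-combine form, which is the kind of algorithm the abstract refers to. It is not the paper's proposed coefficient design; the step size MU, the zero-attraction strength RHO, and the uniform ring-network combination weights are illustrative assumptions.

```python
import numpy as np

# Sketch of sparse (zero-attracting) diffusion LMS, adapt-then-combine form.
# All parameter values below are assumptions for illustration only.

rng = np.random.default_rng(0)

N_NODES, DIM, N_ITERS = 10, 20, 2000
MU, RHO = 0.01, 1e-4                      # assumed step size / l1 penalty weight

# Sparse unknown parameter common to all nodes
w_true = np.zeros(DIM)
w_true[rng.choice(DIM, size=3, replace=False)] = rng.standard_normal(3)

# Ring network: each node is connected to itself and its two neighbours
A = np.zeros((N_NODES, N_NODES))
for k in range(N_NODES):
    A[k, [k - 1, k, (k + 1) % N_NODES]] = 1.0
A /= A.sum(axis=0, keepdims=True)         # combination matrix, columns sum to one

W = np.zeros((N_NODES, DIM))              # local estimates w_k
for _ in range(N_ITERS):
    # Adapt step: local LMS update plus a zero-attracting (sign) term coming
    # from the l1 regularizer, which promotes sparsity of the estimate.
    Psi = np.empty_like(W)
    for k in range(N_NODES):
        u = rng.standard_normal(DIM)                  # regressor at node k
        d = u @ w_true + 0.1 * rng.standard_normal()  # noisy measurement
        e = d - u @ W[k]
        Psi[k] = W[k] + MU * e * u - MU * RHO * np.sign(W[k])

    # Combine step: each node mixes its neighbours' intermediate estimates
    # using the combination coefficients a_{l,k}.
    W = A.T @ Psi

msd = np.mean(np.sum((W - w_true) ** 2, axis=1))
print(f"network mean-square deviation after {N_ITERS} iterations: {msd:.3e}")
```

The combination coefficients are the quantities whose choice the paper addresses; here they are simply uniform over each node's neighbourhood, whereas the paper derives them via message propagation and optimizes them for the steady-state mean-square deviation.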