The Gaussian Process (GP) has been acknowledged as a powerful kernel-based machine learning technique with broad application areas, such as time series prediction and system state estimation. In the era of big data, however, GP faces new challenges. For example, when a huge amount of data is distributed across different locations, how can GP be performed without raising significant privacy concerns? In this paper, we aim to construct a distributed and secured GP learning framework over networks. Specifically, we first propose the idea of secured GP by incorporating a random unitary transform, so that the local processing of data is guaranteed to be secure. Then, noticing that gathering all data at a central node for GP learning is neither efficient nor secure, we extend secured GP to distributed learning over networks by invoking the Alternating Direction Method of Multipliers (ADMM), so that global optimality can be asymptotically reached with only local computations and parameter exchange. Finally, we demonstrate the performance improvements through simulations.
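As a minimal sketch of the random-unitary-transform idea (not the paper's exact scheme; the function names and kernel parameters below are illustrative assumptions), the following Python/NumPy snippet shows why the approach can leave GP learning intact: a random orthogonal (real unitary) matrix Q preserves Euclidean distances, so a distance-based kernel matrix computed on the transformed samples Qx is identical to the one computed on the raw data, while the raw data itself is never exposed.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(d, rng):
    # Draw a random orthogonal matrix via QR decomposition of a
    # Gaussian matrix; Q satisfies Q.T @ Q = I.
    Q, R = np.linalg.qr(rng.standard_normal((d, d)))
    # Fix column signs so the result is Haar-uniform.
    return Q * np.sign(np.diag(R))

def rbf_kernel(X, Y, gamma=0.5):
    # k(x, y) = exp(-gamma * ||x - y||^2); depends on the data only
    # through Euclidean distances, which unitary transforms preserve.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy data: n samples in d dimensions (hypothetical values).
n, d = 8, 4
X = rng.standard_normal((n, d))

Q = random_unitary(d, rng)
X_enc = X @ Q.T          # each sample x is released only as Qx

K_plain = rbf_kernel(X, X)
K_enc = rbf_kernel(X_enc, X_enc)

# The kernel matrix, and hence the GP posterior, is unchanged.
print(np.allclose(K_plain, K_enc))   # True
```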
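Likewise, a toy consensus-ADMM sketch (hypothetical code; a simple quadratic local objective stands in for the actual GP learning problem) illustrates the claim that global optimality can be reached with only local computations plus exchange of a shared parameter: each node updates its own variable privately, and only the consensus variable z is communicated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each of N nodes holds a private local target a_i; the global problem
# is min_x sum_i 0.5 * ||x - a_i||^2, whose solution is the average of
# the a_i. Consensus ADMM reaches it without centralizing the data.
N, d, rho = 5, 3, 1.0
a = rng.standard_normal((N, d))

x = np.zeros((N, d))   # local primal variables, one per node
u = np.zeros((N, d))   # scaled dual variables, one per node
z = np.zeros(d)        # shared (consensus) variable

for _ in range(50):
    # Local x-update: closed form for the quadratic local objective.
    x = (a + rho * (z - u)) / (1.0 + rho)
    # z-update: the only step that aggregates across nodes.
    z = (x + u).mean(axis=0)
    # Local dual updates.
    u = u + x - z

print(np.allclose(z, a.mean(axis=0)))  # True: consensus on the optimum
```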