2022 Volume 30 Pages 209-225
Achieving differential privacy and utilizing secure multiparty computation are the two primary approaches to ensuring privacy in privacy-preserving machine learning. However, the privacy guarantee provided by existing protocols that integrate the two approaches for collaborative learning weakens as more participants join. In this work, we present Secure and Private Gradient Computation (SPGC), a novel collaborative learning framework whose privacy guarantee is independent of the number of participants while still providing high accuracy. The main idea of SPGC is to generate the noise required for differential privacy within the secure multiparty computation itself. We implemented SPGC and measured its accuracy and training time experimentally. The results show that SPGC is up to 5.6% more accurate than a naive protocol based on local differential privacy. Beyond the accuracy evaluation, we experimentally show that the training time increases in proportion to the amount of noise generation, and we demonstrate that the privacy guarantee remains independent of the number of participants.
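To illustrate the core idea, the following is a minimal, purely illustrative simulation of generating differential-privacy noise inside a secret-shared aggregation. It is not the paper's actual protocol: the function names (`share`, `spgc_round`), the additive secret-sharing over reals, and the per-party noise split are assumptions made for the sketch. The point it demonstrates is that when each of the n parties contributes noise of variance sigma^2/n, the reconstructed sum carries total noise of variance sigma^2, regardless of how many parties join.

```python
import numpy as np

def share(value, n_parties, rng=None):
    """Additively secret-share a real vector among n_parties.

    Illustrative only: real MPC protocols share over a finite field,
    not over floating-point reals.
    """
    rng = rng or np.random.default_rng()
    shares = [rng.normal(0.0, 1.0, size=value.shape) for _ in range(n_parties - 1)]
    shares.append(value - sum(shares))  # shares sum back to value
    return shares

def spgc_round(gradients, sigma, rng=None):
    """Simulate one aggregation round (hypothetical sketch).

    Each party secret-shares its gradient plus a partial Gaussian noise
    sample of variance sigma^2 / n, so the reconstructed sum carries
    N(0, sigma^2) noise in total -- independent of the number of parties.
    """
    rng = rng or np.random.default_rng()
    n = len(gradients)
    shape = gradients[0].shape
    # each party samples its portion of the total noise
    partial_noise = [rng.normal(0.0, sigma / np.sqrt(n), size=shape) for _ in range(n)]
    # each party secret-shares (gradient + partial noise)
    all_shares = [share(g + e, n, rng=rng) for g, e in zip(gradients, partial_noise)]
    # only the noisy aggregate is ever reconstructed
    noisy_sum = sum(sum(s) for s in all_shares)
    return noisy_sum
```

Because the noise is added before reconstruction, no party (and no aggregator) ever observes an individual gradient without its noise share, which is what lets the total noise, and hence the privacy guarantee, stay fixed as participants are added.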