2026, Vol. 17, No. 2, pp. 528-548
Federated multi-task learning on edge devices faces prohibitive communication and computational costs when dense neural networks are used. We propose a framework that overcomes these costs by integrating a sparse hidden neural network, inspired by the lottery ticket hypothesis, with ATLAS, a dynamic sharing algorithm based on the common bases hypothesis (CBH). Our method achieves accuracy comparable to that of dense models and reveals a clear collaborative advantage in challenging non-IID settings, outperforming isolated local training. The learned common bases also act as powerful feature extractors that accelerate few-shot transfer learning, validating CBH for sparse networks and enabling efficient collaborative learning.
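To make the "sparse hidden neural network" idea concrete, the following is a minimal illustrative sketch of a supermask-style layer in the spirit of the lottery ticket hypothesis: the dense weights stay frozen and only a binary top-k mask over them is learned, so clients could exchange compact mask or score updates rather than dense weights. This is not the paper's ATLAS algorithm; all names (`SparseHiddenLinear`, `sparsity=0.5`) are assumptions introduced for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMask(torch.autograd.Function):
    """Keep the top-k scoring weights; pass gradients straight through to the scores."""

    @staticmethod
    def forward(ctx, scores, sparsity):
        k = int((1.0 - sparsity) * scores.numel())        # number of weights to keep
        threshold = torch.topk(scores.flatten(), k).values.min()  # k-th largest score
        return (scores >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: gradient w.r.t. scores, none for sparsity.
        return grad_output, None


class SparseHiddenLinear(nn.Module):
    """Hypothetical linear layer: frozen random weights, learned binary supermask."""

    def __init__(self, in_features, out_features, sparsity=0.5):
        super().__init__()
        # Weights are never trained; only the mask scores are.
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.05,
                                   requires_grad=False)
        self.scores = nn.Parameter(torch.rand(out_features, in_features))
        self.sparsity = sparsity

    def forward(self, x):
        mask = TopKMask.apply(self.scores, self.sparsity)
        return F.linear(x, self.weight * mask)            # only unmasked weights contribute


if __name__ == "__main__":
    layer = SparseHiddenLinear(16, 4)
    out = layer(torch.randn(8, 16))
    print(out.shape)  # torch.Size([8, 4])
```

In a federated setting of the kind the abstract describes, such masks (or the shared score bases behind them) would be the objects exchanged and aggregated across tasks; the exact sharing rule used by ATLAS is defined in the paper itself.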