Federated Learning (FL) enables distributed model training by exchanging model parameters rather than raw data, thereby preserving data privacy. While FL is effective in single-task settings, applying it to multiple tasks remains challenging. This study proposes Adaptive Tuning of Layer Sharing (ATLAS), a method that dynamically selects which layers to share and optimizes task weighting based on inter-task similarity. ATLAS improves the efficiency and accuracy of multi-task FL while reducing communication costs.
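The abstract does not give implementation details, so the following Python sketch only illustrates the general idea of similarity-driven layer sharing and task weighting in multi-task FL. All names (`select_shared_layers`, `task_weights`, `aggregate_shared`), the cosine-similarity measure, the sharing threshold, and the softmax weighting are illustrative assumptions, not the authors' method.

```python
import numpy as np


def cosine(u, v, eps=1e-12):
    """Cosine similarity between two flattened parameter updates."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + eps))


def select_shared_layers(task_updates, threshold=0.5):
    """Mark a layer as shared when the mean pairwise similarity of the
    tasks' updates for that layer exceeds the threshold (assumed criterion)."""
    tasks = list(task_updates)
    shared = set()
    for layer in task_updates[tasks[0]]:
        sims = [
            cosine(task_updates[a][layer].ravel(), task_updates[b][layer].ravel())
            for i, a in enumerate(tasks)
            for b in tasks[i + 1:]
        ]
        if np.mean(sims) >= threshold:
            shared.add(layer)
    return shared


def task_weights(task_updates, shared_layers):
    """Weight each task by how well its shared-layer updates agree with the
    other tasks (softmax over mean similarity), so dissimilar tasks
    contribute less to the shared aggregate."""
    tasks = list(task_updates)
    layers = sorted(shared_layers)
    if not layers:  # nothing shared: fall back to uniform weights
        return {t: 1.0 / len(tasks) for t in tasks}
    flat = {
        t: np.concatenate([task_updates[t][l].ravel() for l in layers])
        for t in tasks
    }
    scores = np.array([
        np.mean([cosine(flat[a], flat[b]) for b in tasks if b != a])
        for a in tasks
    ])
    w = np.exp(scores) / np.exp(scores).sum()
    return dict(zip(tasks, w))


def aggregate_shared(task_updates, shared_layers, weights):
    """Weighted average of updates on the shared layers only; unshared
    layers stay task-specific in each local model."""
    return {
        layer: sum(weights[t] * task_updates[t][layer] for t in task_updates)
        for layer in shared_layers
    }


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy round: three tasks agree on layer0/layer1 but diverge on layer2.
    common = {f"layer{i}": rng.normal(size=64) for i in range(2)}
    updates = {}
    for t in ("task_a", "task_b", "task_c"):
        upd = {k: v + 0.1 * rng.normal(size=64) for k, v in common.items()}
        upd["layer2"] = rng.normal(size=64)  # task-specific direction
        updates[t] = upd
    shared = select_shared_layers(updates)
    w = task_weights(updates, shared)
    agg = aggregate_shared(updates, shared, w)
    print("shared layers:", sorted(shared))
    print("task weights:", {k: round(float(v), 3) for k, v in w.items()})
```

In this toy round, only the layers whose updates agree across tasks are aggregated; if, as this sketch assumes, unshared layers are kept local rather than exchanged, that would be one plausible source of the communication savings the abstract reports.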