Nonlinear Theory and Its Applications, IEICE
Online ISSN : 2185-4106
ISSN-L : 2185-4106
Special Section on Recent Advances in Nonlinear Problems
Common Bases Hypothesis: exploring multi-task collaborative learning of neural networks
Fumiya Arai, Atsushi Hori, Takao Marukame, Tetsuya Asai, Kota Ando

2025, Volume 16, Issue 1, Pages 79-95

Abstract

Edge computing methods, particularly federated learning (FL), have attracted significant attention because they allow users to contribute to model training without exposing their training data to other users. Conventional FL, however, assumes a single-task data distribution, and it is unclear whether similar collaborative learning can be achieved across different tasks. We explore a learning architecture that supports multi-task learning through partial model sharing. To demonstrate its effectiveness, we propose the common bases hypothesis, which suggests that common representations can be shared efficiently among tasks. By using singular value decomposition to fix a subspace of the trained weights during learning, we show that FL with partial model sharing across multiple tasks is feasible, owing to the networks' ability to learn similar representations.
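The abstract does not give implementation details, so the following PyTorch-style sketch is only an illustration of the general idea it describes: extracting a subspace from a trained weight matrix via singular value decomposition, freezing that subspace as a common basis, and training only the coefficients within it on another task. The class name FixedBasisLinear, the rank parameter k, and the initialization choices are hypothetical and not taken from the paper.

```python
# Illustrative sketch only (not the authors' code): freeze a k-dimensional
# SVD-derived subspace of a previously trained weight matrix and train only
# the coefficients within that subspace on a new task.
import torch
import torch.nn as nn


class FixedBasisLinear(nn.Module):
    """Linear layer whose weight is constrained to a fixed column space.

    W = U_k @ C, where U_k (out_features x k) holds the top-k left singular
    vectors of a reference weight matrix and is frozen; only C is trained.
    """

    def __init__(self, trained_weight: torch.Tensor, k: int):
        super().__init__()
        out_features, _ = trained_weight.shape
        # SVD of the trained weights; keep the top-k left singular vectors.
        U, S, Vh = torch.linalg.svd(trained_weight, full_matrices=False)
        self.register_buffer("basis", U[:, :k])          # frozen common basis
        # Initialize the trainable coefficients from the projection of the
        # original weights onto the fixed subspace.
        init_coeff = torch.diag(S[:k]) @ Vh[:k, :]
        self.coeff = nn.Parameter(init_coeff)            # k x in_features
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weight = self.basis @ self.coeff                  # out x in, rank <= k
        return nn.functional.linear(x, weight, self.bias)


# Usage: take a layer trained on task A and fine-tune only the in-subspace
# coefficients on task B, while the SVD basis stays fixed.
pretrained = nn.Linear(128, 64)                           # stand-in for a trained layer
layer_b = FixedBasisLinear(pretrained.weight.detach(), k=16)
x = torch.randn(8, 128)
y = layer_b(x)                                            # shape (8, 64)
```

In a federated, multi-task setting, the frozen basis would be the part of the model shared among participants, while the per-task coefficients remain local; this is one plausible reading of "partial model sharing" in the abstract, not a statement of the authors' exact protocol.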

© 2025 The Institute of Electronics, Information and Communication Engineers

This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.
https://creativecommons.org/licenses/by-nc-nd/4.0/