IEICE Communications Express
Online ISSN : 2187-0136
ISSN-L : 2187-0136
Special Cluster on Advanced Communication Technologies in Conjunction with Main Topics of ICETC2020
A novel distributed deep learning training scheme based on distributed skip mesh list
Masaya Suzuki, Kimihiro Mizutani

2021 Volume 10 Issue 8 Pages 463-468

Abstract

Distributed large-scale neural networks are widely used for complicated image recognition and natural language processing across multiple organizations, but maintaining the performance of the nodes (i.e., computation servers) and their network topology incurs a high maintenance cost in a churn environment where node insertion/deletion occurs frequently. To reduce this cost, we propose the Distributed Skip Mesh List Architecture, which provides high stability against node insertion/deletion and automatic node management for distributed large-scale neural networks. In the evaluation, we confirmed that it reduces the maintenance cost (e.g., the number of transmission messages for managing nodes) while maintaining high stability.
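The abstract does not detail the mechanism, but a "skip mesh list" builds on skip-list-style overlays, in which each node joins a small, probabilistically chosen number of levels so that lookups and membership changes touch only O(log n) nodes. The sketch below shows only the classic coin-flip level assignment of a skip list (Pugh, 1990), not the authors' actual distributed protocol; the names `Node` and `expected_links` are hypothetical.

```python
import random


class Node:
    """A node in a skip-list-style overlay.

    Illustrative only: each node draws its level by repeated coin flips,
    then keeps (left, right) neighbor links on every level up to its own.
    """

    def __init__(self, node_id, max_level, p=0.5):
        self.node_id = node_id
        # Coin-flip level assignment: with probability p, climb one level.
        self.level = 0
        while random.random() < p and self.level < max_level:
            self.level += 1
        # neighbors[l] holds the [left, right] links on level l.
        self.neighbors = {l: [None, None] for l in range(self.level + 1)}


def expected_links(num_nodes, p=0.5):
    """Expected total neighbor slots across all nodes: n / (1 - p).

    This is constant per node on average, which is what keeps the
    per-node maintenance cost bounded under churn.
    """
    return num_nodes / (1 - p)
```

With p = 0.5, a node's expected level is 1 and the whole overlay holds about 2n neighbor slots, so inserting or deleting a node only requires repairing a constant expected number of links per level.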

© 2021 The Institute of Electronics, Information and Communication Engineers