IEICE Communications Express
Online ISSN : 2187-0136
ISSN-L : 2187-0136

This article has a final published version. Please refer to the final published version, and cite the final published version when citing this work.

A Novel Distributed Deep Learning Training Scheme Based on Distributed Skip Mesh List
Masaya Suzuki, Kimihiro Mizutani
Journal, free access, advance online publication

Article ID: 2021ETL0023

Abstract

Distributed large-scale neural networks are widely used for complicated image recognition and natural language processing tasks spanning several organizations, but keeping the performance of the nodes (i.e., computation servers) and their network topology incurs a high maintenance cost in a churn environment in which node insertion/deletion occurs frequently. To reduce this cost, we propose the Distributed Skip Mesh List Architecture, which provides high stability against node insertion/deletion and automatic node management for a distributed large-scale neural network. The evaluation confirms that it reduces the maintenance cost (e.g., the number of messages transmitted for managing nodes) while maintaining high stability.
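
The abstract does not describe the overlay's maintenance procedure, so the following is only a minimal, single-machine sketch of a skip-list-style overlay, assuming (hypothetically) that each node keeps multi-level neighbor links and that every link update corresponds to one maintenance message. It is not the authors' Distributed Skip Mesh List protocol; it only illustrates why such a structure needs few messages per node insertion.

import random

class Node:
    """An overlay node with neighbor links on several levels, skip-list style:
    level 0 links every node, higher levels skip over more and more nodes."""
    def __init__(self, key, max_level):
        self.key = key
        self.next = [None] * (max_level + 1)  # hypothetical neighbor table

class SkipListOverlay:
    """Minimal single-machine stand-in for a distributed skip-list overlay."""
    def __init__(self, max_level=4, p=0.5):
        self.max_level = max_level
        self.p = p
        self.head = Node(None, max_level)  # sentinel node
        self.messages = 0                  # links touched, i.e., "maintenance messages"

    def _random_level(self):
        lvl = 0
        while random.random() < self.p and lvl < self.max_level:
            lvl += 1
        return lvl

    def insert(self, key):
        # Walk from the top level down, remembering the last node visited on
        # each level; only these nodes exchange maintenance messages.
        update = [self.head] * (self.max_level + 1)
        cur = self.head
        for lvl in range(self.max_level, -1, -1):
            while cur.next[lvl] is not None and cur.next[lvl].key < key:
                cur = cur.next[lvl]
                self.messages += 1
            update[lvl] = cur
        node = Node(key, self.max_level)
        for lvl in range(self._random_level() + 1):
            node.next[lvl] = update[lvl].next[lvl]
            update[lvl].next[lvl] = node
            self.messages += 1             # one link update == one message
        return node

if __name__ == "__main__":
    overlay = SkipListOverlay()
    for worker_id in range(1, 33):          # insert 32 hypothetical workers
        overlay.insert(worker_id)
    print("links touched for 32 insertions:", overlay.messages)

Because only O(log n) links are expected to change per insertion in a skip-list-style structure, the message count grows slowly with the number of workers, which is consistent with the abstract's claim of reduced maintenance cost under churn.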

© 2021 The Institute of Electronics, Information and Communication Engineers