IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Online ISSN : 1745-1337
Print ISSN : 0916-8508
Regular Section
Efficient Mini-Batch Training on Memristor Neural Network Integrating Gradient Calculation and Weight Update
Satoshi YAMAMORI, Masayuki HIROMOTO, Takashi SATO

2018, Vol. E101.A, No. 7, pp. 1092-1100

Abstract

We propose an efficient training method for memristor neural networks. The proposed method is suitable for mini-batch-based training, a common technique for various neural networks. By integrating the two processes of gradient calculation in the backpropagation algorithm and weight update in the write operation to the memristors, the proposed method accelerates the training process and eliminates the external computing resources, such as multipliers and memories, required by the existing method. Through numerical experiments, we demonstrate that the proposed method converges twice as fast as the existing method while retaining the same level of classification accuracy.
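The fusion of gradient calculation and weight write described above can be sketched in software. The following NumPy toy is a minimal sketch under stated assumptions: a single softmax layer stands in for the memristor crossbar, with conductances modeled as a plain array G, and the names (G, ETA, step_accumulate, step_fused) are illustrative, not the paper's notation. The real method operates on analog write pulses to crossbar devices, not on arrays.

```python
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_OUT, BATCH = 4, 3, 8
ETA = 0.1                                    # learning rate (assumed value)
G = rng.uniform(-0.1, 0.1, (N_IN, N_OUT))    # crossbar conductances (weights)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def step_accumulate(x, y):
    """Conventional mini-batch step: the gradient is accumulated in
    external memory (needing external multipliers and storage) and
    written to the crossbar once at the end of the batch."""
    global G
    p = softmax(x @ G)
    grad = x.T @ (p - y) / len(x)            # external gradient buffer
    G -= ETA * grad                          # single write at batch end

def step_fused(x, y):
    """Fused step in the spirit of the proposed method: each sample's
    gradient contribution is applied immediately as an incremental
    write, so no external gradient buffer is kept. For small ETA this
    closely tracks the batch-averaged update."""
    global G
    for xi, yi in zip(x, y):
        p = softmax(xi @ G)
        G -= (ETA / len(x)) * np.outer(xi, p - yi)  # immediate write pulse

x = rng.normal(size=(BATCH, N_IN))
y = np.eye(N_OUT)[rng.integers(0, N_OUT, BATCH)]
step_fused(x, y)                             # one mini-batch of fused updates
```

The design trade shown here mirrors the abstract: step_fused gives up the exact batch-averaged update (later samples in the batch see already-updated weights) in exchange for removing the external gradient accumulator, which is precisely the resource the proposed method eliminates.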

© 2018 The Institute of Electronics, Information and Communication Engineers