Transactions of the Institute of Systems, Control and Information Engineers
Online ISSN : 2185-811X
Print ISSN : 1342-5668
ISSN-L : 1342-5668
Gradient Descent Learning for Hyperbolic Hopfield Associative Memory
Masayuki Tsuji, Teijiro Isokawa, Masaki Kobayashi, Nobuyuki Matsui, Naotake Kamiura

2021 Volume 34 Issue 1 Pages 11-22

Abstract

This paper proposes a scheme for embedding patterns into Hyperbolic-valued Hopfield Neural Networks (HHNNs). The scheme is based on gradient descent learning (GDL), in which the connection weights among neurons are gradually modified by iteratively presenting the patterns to be embedded. The performance of the proposed scheme is evaluated through several types of numerical experiments and compared with the projection rule (PR) for HHNNs. Experimental results show that pattern embedding by the proposed GDL remains possible for a large number of patterns, a regime in which embedding by PR often fails. It is also shown that the proposed GDL can be improved, in terms of both the stability of embedded patterns and computational cost, by initializing the connection weights with PR and then refining them with GDL.
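The abstract describes a GDL update on the connection weights, optionally warm-started with the projection rule. As a rough illustration of that pipeline, the following Python sketch implements a simplified real-valued analogue: a squared-error gradient step that drives each stored pattern toward being a fixed point, with an optional PR initialization. The paper's actual networks use hyperbolic-valued neurons and weights, which this sketch does not model; the function names (projection_rule, gdl) and parameters (eta, epochs) are assumptions for illustration only, not the authors' formulation.

```python
import numpy as np

def projection_rule(patterns):
    """PR initialization: W = X X^+ projects onto the span of the stored patterns.
    patterns: (P, N) array of +/-1 patterns (real-valued analogue, assumed)."""
    X = patterns.T                      # shape (N, P)
    return X @ np.linalg.pinv(X)        # (N, N) weight matrix

def gdl(patterns, W=None, eta=0.01, epochs=1000):
    """Gradient descent learning (simplified sketch): minimize the squared
    deviation of the local field from each pattern so patterns become stable."""
    P, N = patterns.shape
    if W is None:
        W = np.zeros((N, N))
    for _ in range(epochs):
        for x in patterns:
            h = W @ x                        # local fields
            err = x - h                      # deviation from fixed-point condition
            W += eta * np.outer(err, x)      # gradient step on squared error
            np.fill_diagonal(W, 0.0)         # no self-connections
    return W

# Usage sketch: embed random patterns, warm-starting GDL with the PR weights.
rng = np.random.default_rng(0)
pats = rng.choice([-1.0, 1.0], size=(20, 64))
W0 = projection_rule(pats)               # PR initialization
W = gdl(pats, W=W0.copy())               # refine with GDL
recalled = np.sign(W @ pats[0])
print("pattern 0 stable:", np.array_equal(recalled, pats[0]))
```

Combining the two rules as above mirrors the abstract's observation: PR gives a good starting point cheaply, and the subsequent GDL iterations improve the stability of the embedded patterns while requiring fewer updates than starting from zero weights.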

© 2021 The Institute of Systems, Control and Information Engineers