Nonlinear Theory and Its Applications, IEICE
Online ISSN : 2185-4106
ISSN-L : 2185-4106
Volume 14, Issue 4
Displaying 1-5 of 5 articles from this issue
Special Section on Emerging Technologies of Complex Communication Science
  • Masaki Aida
    Article type: FOREWORD
    2023 Volume 14 Issue 4 Pages 638
    Published: 2023
    Released on J-STAGE: October 01, 2023
    JOURNAL OPEN ACCESS
  • Tatsuya Kaneko, Hiroshi Momose, Hitoshi Suwa, Takashi Ono, Yuriko Haya ...
    Article type: Paper
    2023 Volume 14 Issue 4 Pages 639-651
    Published: 2023
    Released on J-STAGE: October 01, 2023
    JOURNAL OPEN ACCESS

    Computing-in-memory (CIM) devices have attracted attention for their high operational efficiency in edge AI, which requires low-power operation. This paper proposes a digital circuit architecture that controls the inference and learning of CIM devices such as the RAND chip, which exploits the non-linearity of ReRAM memory elements. The RAND chip serves as the CIM device for inference and as external memory for training. In an XOR identification test, the system achieves the same convergence as a software implementation of the learning core. The proposed learning core achieves an efficiency of 7.77 GOPS/W, verifying the effectiveness of the proposed architecture for online learning on CIM devices.
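The XOR identification test used as the convergence benchmark above can be reproduced in software with a tiny multilayer perceptron. The sketch below is an illustrative stand-in under assumed hyperparameters (hidden width, learning rate), not the RAND chip's actual learning-core architecture or data path:

```python
import numpy as np

# Minimal software reference for XOR learning: a 2-4-1 sigmoid MLP trained
# by full-batch gradient descent on squared error. Hyperparameters are
# assumptions for illustration, not taken from the paper.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

losses = []
lr = 1.0
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)           # hidden layer
    out = sigmoid(h @ W2 + b2)         # output layer
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation through the two sigmoid layers
    d2 = err * out * (1 - out)
    d1 = (d2 @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

print(losses[0], losses[-1])           # training loss shrinks over the run
```

A hardware learning core is judged "converged" here when its loss curve matches such a software reference on the same task.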

  • Saki Okamoto, Kenya Jin'no
    Article type: Paper
    2023 Volume 14 Issue 4 Pages 652-676
    Published: 2023
    Released on J-STAGE: October 01, 2023
    JOURNAL OPEN ACCESS

    An encoder-decoder model consists of an encoder that maps the input to a low-dimensional latent variable and a decoder that maps this latent variable back to the dimension of the input. Encoder-decoder models perform representation learning, automatically extracting features of the data, but the model is a black box and it is not clear what features are extracted. A skip connection between the encoder and decoder is generally believed to convey high-resolution information and to improve accuracy, but its actual role remains unclear. In this study, we focus on this concatenation and experimentally clarify the role of the latent variables it conveys when the images given to the input and output during training are the same or different.
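The concatenation studied above can be sketched schematically: in U-Net-style models, an encoder feature map is concatenated channel-wise with the decoder's upsampled activation before the next decoder layer. This is a minimal sketch of that operation (the paper's exact architecture and tensor sizes are assumptions here):

```python
import numpy as np

# Schematic U-Net-style skip connection: concatenate an encoder feature
# map with the matching decoder activation along the channel axis.
# Shapes are illustrative assumptions, not taken from the paper.
def concat_skip(decoder_act, encoder_feat):
    # Both tensors: (batch, channels, height, width); spatial dims must match.
    assert decoder_act.shape[2:] == encoder_feat.shape[2:]
    return np.concatenate([decoder_act, encoder_feat], axis=1)

dec = np.zeros((1, 64, 32, 32))   # upsampled decoder activation
enc = np.ones((1, 64, 32, 32))    # encoder feature map at the same resolution
merged = concat_skip(dec, enc)
print(merged.shape)               # (1, 128, 32, 32)
```

The next decoder layer thus sees both the decoded latent information and the high-resolution encoder features, which is exactly why it is hard to attribute the output to either path.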

  • Yota Tsukamoto, Honami Tsushima, Tohru Ikeguchi
    Article type: Paper
    2023 Volume 14 Issue 4 Pages 677-690
    Published: 2023
    Released on J-STAGE: October 01, 2023
    JOURNAL OPEN ACCESS

    The Izhikevich neuron model can reproduce various types of neurons, including chaotic neurons, with appropriate parameter sets. This study analyzes the responses of a periodically forced Izhikevich neuron with chaotic parameters, quantifying interspike intervals (ISIs) with three measures: the diversity index, the coefficient of variation, and the local variation. Evaluating ISIs with these three measures in combination clarifies differences in neuronal activity that evaluation with any single measure cannot. In addition, we analyze the change in the stability of the equilibrium points caused by a periodic input on a phase plane. The results indicate that, for electrophysiologically feasible parameter sets, the stability of the equilibrium points plays a crucial role in determining the critical amplitude around which irregular activities transition to regular ones. Thus, the relationship between neural behavior and the period and amplitude of the input current depends on the existence and stability of the equilibrium points.
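Two of the three ISI measures named above have standard definitions for an ISI sequence I_1, ..., I_n: the coefficient of variation CV = std(I)/mean(I) and the local variation Lv = (3/(n-1)) * sum_k ((I_k - I_{k+1})/(I_k + I_{k+1}))^2. A minimal sketch (the paper's diversity index is not reproduced here):

```python
import math

# Standard spike-train irregularity measures over an ISI sequence.
# CV is global; Lv compares adjacent intervals, so it is robust to
# slow rate drift. The diversity index is not reproduced here.
def coefficient_of_variation(isis):
    n = len(isis)
    mean = sum(isis) / n
    var = sum((x - mean) ** 2 for x in isis) / n
    return math.sqrt(var) / mean

def local_variation(isis):
    # Lv = 3/(n-1) * sum_k ((I_k - I_{k+1}) / (I_k + I_{k+1}))**2
    n = len(isis)
    return 3.0 / (n - 1) * sum(
        ((isis[k] - isis[k + 1]) / (isis[k] + isis[k + 1])) ** 2
        for k in range(n - 1)
    )

regular = [10.0] * 20  # perfectly periodic firing
print(coefficient_of_variation(regular), local_variation(regular))  # 0.0 0.0
```

Both measures vanish for perfectly periodic firing and approach 1 for Poisson-like firing, so they separate regular from irregular activity; combining them with the diversity index is what distinguishes activities that any single measure conflates.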

Regular Section
  • Tsuyoshi Ishizone, Tomoyuki Higuchi, Kazuyuki Nakamura
    Article type: Paper
    2023 Volume 14 Issue 4 Pages 691-717
    Published: 2023
    Released on J-STAGE: October 01, 2023
    JOURNAL OPEN ACCESS

    Time-series model inference can be divided into modeling and optimization. Sequential VAEs have been studied as a modeling technique. As optimization techniques, methods combining variational inference (VI) and sequential Monte Carlo (SMC) have been proposed; however, they have two drawbacks: limited particle diversity and biased gradient estimators. This paper proposes the Ensemble Kalman Variational Objective (EnKO), a VI framework based on the ensemble Kalman filter, for inferring latent time-series models. The proposed method learns time-series models efficiently owing to its particle diversity and unbiased gradient estimators. We demonstrate that EnKO outperforms previous SMC-based VI methods in predictive ability on several synthetic and real-world data sets.
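The ensemble Kalman filter that EnKO builds on replaces SMC's resampling with an analysis step that shifts every particle toward the observation, which is why the particle set stays diverse. A sketch of one stochastic EnKF analysis step with a linear observation operator (this illustrates the underlying filter, not the EnKO objective itself; all shapes and noise levels are assumptions):

```python
import numpy as np

# One analysis step of a stochastic ensemble Kalman filter (EnKF).
# Every ensemble member is updated, so no particles are discarded,
# unlike SMC resampling. Shapes/values here are illustrative.
def enkf_update(ensemble, y, H, R, rng):
    # ensemble: (N, d) forecast states; y: (m,) observation;
    # H: (m, d) linear observation operator; R: (m, m) obs-noise covariance.
    N = ensemble.shape[0]
    Yf = ensemble @ H.T                      # predicted observations (N, m)
    Xa = ensemble - ensemble.mean(0)         # state anomalies
    Ya = Yf - Yf.mean(0)                     # observation anomalies
    Pxy = Xa.T @ Ya / (N - 1)                # cross covariance (d, m)
    Pyy = Ya.T @ Ya / (N - 1)                # predicted-obs covariance (m, m)
    K = Pxy @ np.linalg.inv(Pyy + R)         # Kalman gain (d, m)
    eps = rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    return ensemble + (y + eps - Yf) @ K.T   # shifted analysis ensemble

rng = np.random.default_rng(0)
ens = rng.normal(5.0, 2.0, size=(200, 1))    # prior ensemble far from truth
y = np.array([0.0])                          # observation near zero
post = enkf_update(ens, y, H=np.eye(1), R=np.eye(1) * 0.1, rng=rng)
print(ens.mean(), post.mean())               # posterior mean pulled toward y
```

Because the update is a smooth function of the ensemble, gradients can flow through it, which is the property a VI objective needs to avoid the biased estimators of resampling-based SMC.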
