IEICE ESS Fundamentals Review
Online ISSN : 1882-0875
ISSN-L : 1882-0875
Volume 18, Issue 1
Cover
Table of Contents
Preface
Origins of Technology
Proposed by Editorial Committee
  • —Fusion of Legacy and Modern Unix Technologies—
    Taiji YAMADA, Jun TAKAHASHI, Yutaka SHIMADA, Tohru IKEGUCHI
    2024, Volume 18, Issue 1, Pages 7-28
    Published: July 01, 2024
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    NFS (Network File System) is a distributed file system used on Unix. In the 1990s, when computer networks were managed and operated with Unix, it was common for users' home directories to be shared through NFS on every machine they logged into. Such an environment, in which users' home directories can be reached at any time within a local network, has various advantages: for example, users on the local network can share technical information with each other and make use of the network's computer resources. However, now that storage technologies have evolved from HDDs to SSDs, NFS file sharing itself becomes a bottleneck in access speed even over a high-transmission-speed network such as 10 GbE, making it difficult to bring out the performance of faster machines. We therefore developed a network environment in which users' home directories can be shared within a local network without reducing the access speed of SSDs at the console. Specifically, we realized a distributed computer network using mesh-type NFS home sharing, in which each machine exports its home directory through NFS on macOS, a modern Unix. This article offers one technical solution to this issue.

    Download PDF (8642K)
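    The mesh-type home sharing described above can be sketched as an automounter map generated per machine: every host exports its own home directory and mounts each peer's, while the console user's own home stays on the local SSD. The host names, export path, and map format below are illustrative assumptions, not the authors' actual configuration.

    ```python
    # Sketch: generating an automounter map for mesh-type NFS home sharing.
    # Each machine exports its local home directory; every other machine
    # mounts its peers' homes via automount, so any home is reachable
    # network-wide while the console user keeps local SSD-speed access.
    # Host names and the map format are illustrative assumptions.

    HOSTS = ["mac01", "mac02", "mac03"]   # machines in the local network
    EXPORT_PATH = "/Users"                # directory each host exports

    def auto_home_map(local_host: str) -> str:
        """Build an auto_home-style map that mounts every peer's exported
        home but leaves the local host's home on the fast local SSD."""
        lines = []
        for host in HOSTS:
            if host == local_host:
                continue  # local home stays local: no NFS round-trip
            lines.append(f"{host} -fstype=nfs,resvport {host}:{EXPORT_PATH}")
        return "\n".join(lines)

    print(auto_home_map("mac01"))
    ```

    Each machine generates its own map, so the mesh needs no central file server; adding a host means adding one line on every peer.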
Advanced Review Papers
Proposed by IT (Information Theory)
  • Tadashi WADAYAMA
    2024, Volume 18, Issue 1, Pages 29-41
    Published: July 01, 2024
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    In this paper, we present an overview of proximal decoding for LDPC codes, proposed by Wadayama and Takabe. The related paper (Wadayama and Takabe, IEICE EA, no.3, pp.359–367, 2023) was awarded the 2022 IEICE Paper Award. The proximal gradient method, widely used in signal processing for solving inverse problems such as sparse signal reconstruction, was applied to LDPC decoding for the first time in that work. The core idea of the proposed method is to perform approximate maximum a posteriori probability (MAP) decoding by applying the proximal gradient method to iteratively minimize an objective function consisting of the negative log-likelihood corresponding to the communication channel and the code potential energy function corresponding to the code. By appropriately modifying the negative log-likelihood function, the decoding method can be applied to a wide range of communication channels. Computational experiments have shown that the proposed decoding method provides a significant improvement in decoding performance compared with the existing de facto standard method (a combination of MMSE signal detection and belief propagation decoding) for LDPC-coded MIMO channels, which are important in modern wireless communication systems. The effectiveness of the proposed method has also been experimentally demonstrated for channels with colored Gaussian noise and nonlinearities, for which the construction of efficient decoding methods is challenging. Furthermore, the proposed method can be implemented using tensor computations and has an iterative structure suitable for deep-learning-oriented hardware such as general-purpose graphics processing units (GPGPUs) and AI accelerators. It is also highly compatible with machine learning techniques such as deep unfolding, and future research on practical applications is expected.

    Download PDF (2439K)
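    A minimal sketch of the decoding idea: alternate a gradient step on the channel's negative log-likelihood with a descent step on a code-potential term that pushes every parity check toward satisfaction. The AWGN (quadratic) likelihood, the toy squared-error potential, the step sizes, and the update schedule below are simplifying assumptions, not the exact formulation of Wadayama and Takabe.

    ```python
    import numpy as np

    def proximal_decode(y, H, sigma2=0.5, gamma=0.05, rho=0.1, iters=100):
        """Toy proximal-gradient-style decoder in the bipolar domain
        (bit 0 -> +1, bit 1 -> -1). Alternates a gradient step on the
        AWGN negative log-likelihood (quadratic in x) with a descent
        step that pulls each parity-check product toward +1."""
        x = y.astype(float).copy()
        checks = [np.flatnonzero(row) for row in H]   # variable sets per check
        for _ in range(iters):
            x -= gamma * (x - y) / sigma2             # channel (likelihood) step
            for idx in checks:                        # code-potential step
                for j in idx:
                    others = np.prod([x[k] for k in idx if k != j])
                    p = others * x[j]                 # current check "parity"
                    x[j] += rho * (1.0 - p) * others  # pull p toward +1
            x = np.clip(x, -1.5, 1.5)                 # keep iterates bounded
        return (x < 0).astype(int)                    # bipolar -> bits
    ```

    The same structure explains the hardware appeal noted in the abstract: both steps are element-wise or small-product tensor operations, so the iteration maps naturally onto GPGPU-style pipelines.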
Review Papers
Proposed by BioX (Biometrics)
  • Shigeya SUZUKI, Kristina YASUDA, Naohiro FUJIE, Ryosuke ABE
    2024, Volume 18, Issue 1, Pages 42-55
    Published: July 01, 2024
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    Decentralized identifiers (DIDs) and verifiable credentials (VCs) are attracting attention as a new form of digital identity implementation. The traditional model concentrates the management and provision of user information in an identity-service provider that users must firmly trust. The VC model shifts to one in which the roles of issuer, holder, and verifier are separated, allowing fine-grained control of digital identities. At the heart of this technology are the DID and VC data models, which are being actively developed and standardized along with various related technologies. This paper outlines the background, standardization, related protocols, application examples, issues, and discussion points of this emerging technology.

    Download PDF (1855K)
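    The issuer/holder/verifier separation can be illustrated with a toy flow: the issuer signs a credential, the holder keeps and presents it, and the verifier checks it by resolving the issuer's key, never contacting the issuer directly. Real VCs use asymmetric signatures whose verification keys come from the issuer's DID document; here an HMAC key published in a mock registry stands in for that key, and all identifiers and field names are illustrative.

    ```python
    import hmac, hashlib, json

    DID_REGISTRY = {}  # did -> verification key (mock DID resolution)

    def issue(issuer_did, issuer_key, subject_did, claims):
        """Issuer role: publish a verification key and sign a credential
        about the subject. The signed object is then held by the subject."""
        DID_REGISTRY[issuer_did] = issuer_key
        payload = json.dumps({"issuer": issuer_did,
                              "credentialSubject": {"id": subject_did, **claims}},
                             sort_keys=True)
        proof = hmac.new(issuer_key, payload.encode(), hashlib.sha256).hexdigest()
        return {"payload": payload, "proof": proof}

    def verify(credential):
        """Verifier role: resolve the issuer's key from the registry
        (mock DID resolution) and check the proof over the payload."""
        issuer = json.loads(credential["payload"])["issuer"]
        key = DID_REGISTRY[issuer]
        expected = hmac.new(key, credential["payload"].encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, credential["proof"])

    vc = issue("did:example:univ", b"secret", "did:example:alice", {"degree": "BSc"})
    print(verify(vc))  # the verifier never contacts the issuer directly
    ```

    The point of the sketch is the role split: verification depends only on the credential and the resolvable key material, which is what enables the fine-grained control the abstract describes.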
Proposed by US (Ultrasonics)
  • Hideyuki HASEGAWA
    2024, Volume 18, Issue 1, Pages 56-70
    Published: July 01, 2024
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    Delay-and-sum (DAS) beamforming is widely used to generate B-mode images from the echo signals obtained at the transducer elements of ultrasound array probes. However, the spatial resolution and contrast provided by DAS beamforming are limited by the physical specifications of the array, such as element spacing. To overcome such constraints, adaptive beamforming techniques have been actively researched and developed, particularly with the proliferation of digital, programmable ultrasound systems in recent years. On the other hand, it is also important to quantitatively evaluate whether the developed methods actually improve image characteristics. Recently, many adaptive beamforming techniques, which often alter the characteristics of ultrasound images, have been developed, and efforts have accordingly been made to improve quantitative methods for assessing image quality. In this article, we primarily review recent developments in adaptive beamforming and image quality evaluation methods.

    Download PDF (2764K)
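    DAS beamforming itself is compact enough to sketch: for one image point, each element's signal is delayed by the travel time to that point and the delayed samples are summed. The sketch below assumes a plane-wave-like transmit along z (so the transmit path is just the depth), nearest-sample delays with no interpolation, and no apodization; it is a minimal illustration, not a full beamformer.

    ```python
    import numpy as np

    def das_beamform(rf, element_x, focus_x, focus_z, fs, c=1540.0):
        """Delay-and-sum output for a single focal point.
        rf: (n_elements, n_samples) echo data sampled at fs [Hz];
        element_x: lateral element positions [m]; c: sound speed [m/s]."""
        out = 0.0
        for ch, xe in enumerate(element_x):
            d = np.hypot(focus_x - xe, focus_z)   # element -> focus distance
            delay = (focus_z + d) / c             # transmit (plane wave) + receive
            idx = int(round(delay * fs))          # nearest-sample delay
            if idx < rf.shape[1]:
                out += rf[ch, idx]                # coherent sum across elements
        return out
    ```

    At the true scatterer position the per-element delays align and the samples add coherently; elsewhere they do not, which is exactly the resolution/contrast behavior that the array geometry limits and adaptive methods try to improve.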
Proposed by CCS (Complex Communication Sciences)
  • Tetsuya ASAI
    2024, Volume 18, Issue 1, Pages 71-78
    Published: July 01, 2024
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    The stochastic computing framework proposed in the 1960s allows certain arithmetic operators to be represented with very few elements, making it possible to construct massively parallel, power-saving hardware. However, the variety of arithmetic operations that can be performed is limited, and the trade-off between arithmetic precision and computation time remains an issue. In recent years, attempts have been made to apply stochastic computing to AI, where the types of arithmetic operation are relatively limited and performance can remain “reasonable” even at low precision, and possible solutions to the remaining issues have also been found. Integrating these technologies lets us envision future edge-AI hardware in which all the arithmetic elements and memory required for AI inference and learning are realized with “stochastic in-memory computing”. In this paper, we describe these elemental technologies, solutions to their issues, and integrated AI architectures, and provide a bird's-eye view of real AI applications of stochastic computing.

    Download PDF (2551K)
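    The core trick of stochastic computing, multiplication realized as a single AND gate acting on random bitstreams, can be sketched directly; the stream length is the precision-versus-time trade-off the abstract mentions. The unipolar encoding and parameters below are illustrative.

    ```python
    import random

    def to_stream(p, n, rng):
        """Unipolar encoding: a value p in [0, 1] becomes a random
        bitstream whose ones-density is p."""
        return [1 if rng.random() < p else 0 for _ in range(n)]

    def sc_multiply(a, b, n=10000, seed=0):
        """Stochastic multiplication: bitwise AND of two independent
        unipolar streams; the output ones-density estimates a * b.
        In hardware this is one AND gate instead of a multiplier."""
        rng = random.Random(seed)
        sa = to_stream(a, n, rng)
        sb = to_stream(b, n, rng)
        return sum(x & y for x, y in zip(sa, sb)) / n

    print(sc_multiply(0.5, 0.8))  # ≈ 0.4; accuracy grows with stream length n
    ```

    Halving the error requires roughly quadrupling n, which is why low-precision-tolerant AI workloads are such a natural fit.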
Proposed by ICTSSL (Information and Communication Technologies for Safe and Secure Life)
  • Masahiro NISHI, Makoto KOBAYASHI, Koichi SHIN
    2024, Volume 18, Issue 1, Pages 79-87
    Published: July 01, 2024
    Released on J-STAGE: July 01, 2024
    JOURNAL FREE ACCESS

    Landslides caused by heavy rain occur frequently in various parts of Japan, and local residents continue to suffer from them. To reduce the damage caused by landslides, a mechanism that encourages evacuation is necessary, and our research group has developed, in cooperation with local residents, a landslide monitoring system that constantly provides images taken with an infrared camera. In this paper, we give an overview of the landslide monitoring system and describe our trials to automatically assess the degree of risk through image analysis and through soil-moisture estimation using radio waves, with the aim of detecting landslides more quickly. We also outline research and development toward a landslide monitoring network in mountainous areas.

    Download PDF (2694K)
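    As a toy surrogate for the image-analysis side of such a system, a frame-differencing check on successive infrared images can flag when a large area of the slope has changed. The thresholds and the alert rule below are illustrative assumptions, not the authors' actual method.

    ```python
    def changed_fraction(prev, curr, diff_thresh=30):
        """Fraction of pixels whose intensity changed by more than
        diff_thresh between two grayscale frames (lists of rows)."""
        total = changed = 0
        for row_p, row_c in zip(prev, curr):
            for p, c in zip(row_p, row_c):
                total += 1
                if abs(p - c) > diff_thresh:
                    changed += 1
        return changed / total

    def slope_alert(prev, curr, area_thresh=0.05):
        """Raise an alert when a sufficiently large area of the slope
        image has changed between frames (toy risk indicator)."""
        return changed_fraction(prev, curr) > area_thresh
    ```

    The per-pixel threshold suppresses sensor noise and gradual thermal drift, while the area threshold separates localized motion (animals, vegetation) from the large coherent change a slope failure would produce.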
Proposed by ISEC (Information Security)
ESS News
Let's go to IEICE Workshops!
Report
Winners' Voice
Call for Participations
Call for Papers
Committees & Editors Notes