2024 Volume 18 Issue 1 Pages 71-78
The stochastic computing framework, proposed in the 1960s, allows certain arithmetic operators to be implemented with very few elements, making it possible to construct massively parallel, power-saving hardware. However, the range of arithmetic operations it supports is limited, and the trade-off between arithmetic precision and computation time remains an issue. In recent years, attempts have been made to apply stochastic computing to AI, where the types of arithmetic operation are relatively limited and performance can remain “reasonable” even at low precision; promising solutions to the precision/time trade-off have also emerged. Integrating these technologies points toward future edge AI hardware in which all the arithmetic elements and memory required for AI inference and training are realized as “stochastic in-memory computing”. In this paper, we describe these elemental technologies, solutions to the outstanding issues, and integrated AI architectures, and provide a bird's-eye view of real AI applications of stochastic computing.
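To make the abstract's claims concrete, the following is a minimal sketch (not from the paper itself) of the classic stochastic-computing idea: a value in [0, 1] is encoded as the probability of a 1 in a random bitstream, so multiplication reduces to a single AND gate per bit. The function names and the software simulation here are illustrative assumptions; real designs use hardware random sources. The sketch also exposes the precision/time trade-off the abstract mentions, since accuracy grows only with stream length.

```python
import random


def to_stream(p, n, rng):
    # Encode a value p in [0, 1] as an n-bit stochastic bitstream:
    # each bit is 1 with probability p (unipolar encoding).
    return [1 if rng.random() < p else 0 for _ in range(n)]


def sc_multiply(a, b, n, seed=0):
    # Stochastic multiplication: AND two independent streams bitwise,
    # then estimate the product as the fraction of 1s in the result.
    rng = random.Random(seed)
    sa = to_stream(a, n, rng)
    sb = to_stream(b, n, rng)
    return sum(x & y for x, y in zip(sa, sb)) / n


# Longer streams give better precision but take proportionally longer,
# which is the precision/time trade-off discussed in the abstract.
coarse = sc_multiply(0.5, 0.5, 64)    # rough estimate of 0.25
fine = sc_multiply(0.5, 0.5, 65536)   # much closer to 0.25
```

The single-AND multiplier is what makes the hardware so compact: a conventional fixed-point multiplier needs hundreds of gates, while the stochastic version needs one gate plus stream generation.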