IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Online ISSN : 1745-1337
Print ISSN : 0916-8508
Special Section on Smart Multimedia & Communication Systems
Deep Gaussian Denoising Network Based on Morphological Operators with Low-Precision Arithmetic
Hikaru FUJISAKI, Makoto NAKASHIZUKA

2022 Volume E105.A Issue 4 Pages 631-638

Abstract

This paper presents a deep network based on morphological filters for Gaussian denoising. Morphological filters can be applied using only addition, max, and min operations and require few computational resources; the proposed network is therefore suitable for implementation on a small microprocessor. Each layer of the proposed network consists of a top-hat transform, which extracts the small peaks and valleys of the noise components from the input image. The noise components are iteratively reduced in each layer by subtracting them from the input image. In this paper, extensions of opening and closing are introduced as linear combinations of morphological filters for the top-hat transform of this deep network. Multiplications are required only for the linear combination of the morphological filters in the proposed network. Because almost all parameters of the network are the structuring elements of the morphological filters, the feature maps and parameters can be represented as short bit-length integers, which is suitable for implementation with single-instruction, multiple-data (SIMD) instructions. Denoising examples show that the proposed network obtains denoising results comparable to those of BM3D [1] without linear convolutions and with approximately one-tenth the number of parameters of a full-scale deep convolutional neural network [2]. The computational time of the proposed method using the SIMD instructions of a microprocessor is also presented.
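
The following is a minimal sketch, based only on the description above, of how one such layer could look: a white/black top-hat pair built from grayscale erosion and dilation (additions, max, and min on short integers) estimates the small noise peaks and valleys, which are then subtracted from the layer input. The flat 3x3 structuring element, the integer weights, and the function names (erode, dilate, layer) are illustrative assumptions, not the authors' implementation.

import numpy as np

def erode(img, se):
    """Grayscale erosion: minimum of (pixel - SE weight) over the window (min and add only)."""
    k = se.shape[0] // 2
    padded = np.pad(img, k, mode="edge").astype(np.int16)
    out = np.full(img.shape, np.iinfo(np.int16).max, dtype=np.int16)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            window = padded[k + dy:k + dy + img.shape[0],
                            k + dx:k + dx + img.shape[1]]
            out = np.minimum(out, window - se[k + dy, k + dx])
    return out

def dilate(img, se):
    """Grayscale dilation: maximum of (pixel + SE weight) over the window (max and add only)."""
    k = se.shape[0] // 2
    padded = np.pad(img, k, mode="edge").astype(np.int16)
    out = np.full(img.shape, np.iinfo(np.int16).min, dtype=np.int16)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            window = padded[k + dy:k + dy + img.shape[0],
                            k + dx:k + dx + img.shape[1]]
            out = np.maximum(out, window + se[k + dy, k + dx])
    return out

def layer(img, se, w_peak=1, w_valley=1):
    """One illustrative layer: estimate small noise peaks (white top-hat) and valleys
    (black top-hat) and subtract them from the input. The weights w_peak and w_valley
    stand in for the linear combination of morphological filters, the only place
    where multiplications occur."""
    opening = dilate(erode(img, se), se)   # suppresses small bright peaks
    closing = erode(dilate(img, se), se)   # fills small dark valleys
    peaks = img - opening                  # white top-hat: bright noise components
    valleys = closing - img                # black top-hat: dark noise components
    return img - w_peak * peaks + w_valley * valleys

# Usage: iterate a few layers over a noisy short-integer image with a flat 3x3 SE.
se = np.zeros((3, 3), dtype=np.int16)                          # flat structuring element (assumed)
noisy = np.random.randint(0, 256, (64, 64)).astype(np.int16)   # placeholder noisy image
x = noisy
for _ in range(4):
    x = layer(x, se)

Because every intermediate quantity stays in int16 and the inner loops reduce to shifted additions with min/max, a sketch of this form maps naturally onto the short-integer SIMD instructions mentioned in the abstract.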

© 2022 The Institute of Electronics, Information and Communication Engineers