Abstract
Vector quantization is used for both the storage and transmission of speech and image data, and an algorithm that minimizes the distortion error is often required. To obtain the minimum distortion error in neural networks for vector quantization, improved competitive learning algorithms have been introduced. Among the many such algorithms, the self-creating neural network and the self-deleting neural network are known to exhibit good characteristics. In this paper, we improve the self-deleting neural network and propose a generalized algorithm that combines the self-creating and self-deleting networks. The algorithm proceeds as follows: first, a few weight (reference) vectors are prepared and the self-creating algorithm is applied, so that weight vectors are created automatically. Next, the self-deleting algorithm is applied, and weight vectors are deleted sequentially until the fixed number of vectors remains. Experimental results show the effectiveness of the proposed algorithm.
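
To make the two-phase procedure concrete, the following is a minimal sketch of a create-then-delete codebook training loop. The specific creation and deletion criteria used here (splitting the winner when its accumulated distortion exceeds a threshold, and removing the least-used vector) are illustrative assumptions rather than the criteria defined in this paper, and the function and parameter names are hypothetical.

    import numpy as np

    def train_codebook(data, target_size, n_init=2, epochs=20,
                       create_threshold=1.0, lr=0.05, rng=None):
        """Grow a codebook with a creation rule, then prune it back
        to `target_size` vectors. `data` is an (N, dim) array.
        Creation/deletion rules here are illustrative assumptions."""
        rng = np.random.default_rng(rng)

        # Phase 1: start from a few weight (reference) vectors and let
        # the creating pass add vectors automatically during training.
        codebook = data[rng.choice(len(data), n_init, replace=False)].copy()
        distortion = np.zeros(n_init)
        wins = np.zeros(n_init)

        for _ in range(epochs):
            for x in data:
                d = np.linalg.norm(codebook - x, axis=1)
                w = np.argmin(d)                       # winning (best-matching) vector
                codebook[w] += lr * (x - codebook[w])  # competitive update toward the input
                distortion[w] += d[w]
                wins[w] += 1
                # Assumed creation rule: split the winner once its
                # accumulated distortion exceeds a threshold.
                if distortion[w] > create_threshold:
                    new = codebook[w] + 0.01 * rng.standard_normal(codebook.shape[1])
                    codebook = np.vstack([codebook, new])
                    distortion = np.append(distortion, 0.0)
                    wins = np.append(wins, 0.0)
                    distortion[w] = 0.0

        # Phase 2: deleting pass, removing vectors one by one until the
        # fixed codebook size is reached (least-used vector assumed).
        while len(codebook) > target_size:
            loser = np.argmin(wins)
            codebook = np.delete(codebook, loser, axis=0)
            wins = np.delete(wins, loser)

        return codebook

For example, train_codebook(samples, target_size=16) would grow the codebook from two initial vectors and then prune it back to 16; the creation threshold and learning rate control how many vectors the growing phase produces before deletion begins.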