Host: The Japanese Society for Artificial Intelligence
Name: The 36th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 36
Location: [in Japanese]
Date: June 14, 2022 - June 17, 2022
Recently, statistics-based methods on CNN features have been employed to represent rich style information. One such method is Deep Correlation Features (DCF), the Gram matrix of vectorized feature maps, which has been shown to benefit painting and affective-imagery recognition. Because it computes the inner product between pairs of feature maps, the Gram matrix can be regarded as a form of Attention mechanism, in which each feature map adaptively reweights the others. To our knowledge, this is the first paper to interpret Deep Correlation Features from the perspective of the Attention mechanism. Inspired by the idea of sparsifying Query-Key Attention, we propose the Sparse Gram Matrix Module (SGMM). Our network consists of two parts: multi-head SGMM and inter-layer concatenation. A performance evaluation on classifying and retrieving anime-style artists showed superiority in closed-set accuracy metrics. We also discuss several characteristics of SGMM, showing that it behaves similarly to the Attention mechanism.
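To make the connection between DCF and Attention concrete, the following is a minimal PyTorch sketch of the Gram matrix over vectorized feature maps, together with a hypothetical row-wise top-k sparsification in the spirit of sparse Query-Key Attention. The function names and the top-k scheme are illustrative assumptions, not the paper's actual SGMM implementation.

```python
import torch


def gram_matrix(feat):
    """Gram matrix of vectorized feature maps (Deep Correlation Features).

    feat: CNN activations of shape (C, H, W). Each of the C feature maps is
    flattened to a vector of length H*W; entry (i, j) of the result is the
    inner product between feature maps i and j.
    """
    F = feat.flatten(start_dim=1)   # (C, H*W)
    return F @ F.t()                # (C, C)


def sparsify_topk(gram, k):
    """Hypothetical sparsification: keep only the k largest entries per row,
    analogous to sparsifying Query-Key Attention (illustration only)."""
    vals, idx = gram.topk(k, dim=-1)
    sparse = torch.zeros_like(gram)
    return sparse.scatter_(-1, idx, vals)


# Example on random activations (64 channels, 32x32 spatial grid).
feat = torch.randn(64, 32, 32)
G = gram_matrix(feat)               # dense (64, 64) correlation matrix
G_sparse = sparsify_topk(G, k=8)    # sparse variant, 8 entries kept per row
```

In this reading, the flattened feature maps play the roles of both Query and Key, and the Gram matrix is the resulting attention-like score matrix; sparsification then restricts each feature map to its strongest correlations.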