Host: The Japanese Society for Artificial Intelligence
Name: The 37th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 37
Location: [in Japanese]
Date: June 06, 2023 - June 09, 2023
Recent theoretical and empirical studies have shown the existence of strong lottery tickets (SLTs), i.e., sparse subnetworks that achieve high performance \textit{without any weight updates}, within randomly initialized over-parameterized networks. However, little is known about how these SLTs are discovered by the de facto standard edge-popup algorithm (EP), making it difficult to improve its performance. In this paper, we first show that EP suffers from \textit{the dying edge problem}: most weights are \textit{never} used during the entire search process, suggesting that EP only searches around the randomly selected initial subnetwork. We then propose \textit{iterative edge-popup (iEP)}, which repeats EP while gradually increasing the pruning rate and rewinding the learning rate after each iteration. To validate the effectiveness of the proposed method, we conducted experiments on the ImageNet, CIFAR-10, and CIFAR-100 datasets. On ImageNet, iEP achieves 76.0\% accuracy with approximately 20 million parameters, whereas regular weight training achieves 73.3\% with 22 million parameters and EP achieves 73.3\% with about 20 million parameters. Our results also provide new insights into why iterative pruning often helps to find good sparse networks.
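To make the two ideas concrete, the sketch below shows a minimal PyTorch rendering of the edge-popup mechanism (frozen weights, trainable per-edge scores, top-k masking with a straight-through estimator) and an outer loop that tightens the pruning rate while rewinding the learning rate, in the spirit of iEP. All names here (GetSubnet, MaskedLinear, run_iep, the pruning schedule, and the learning rate) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GetSubnet(torch.autograd.Function):
    """Select the top-k fraction of edges by score (straight-through)."""

    @staticmethod
    def forward(ctx, scores, k):
        # Keep the fraction k of edges with the highest scores.
        mask = torch.zeros_like(scores)
        n_keep = max(1, int(k * scores.numel()))
        idx = torch.topk(scores.flatten(), n_keep).indices
        mask.view(-1)[idx] = 1.0
        return mask

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: gradients flow to ALL scores,
        # including those of currently pruned ("dead") edges.
        return grad_output, None


class MaskedLinear(nn.Linear):
    """Linear layer whose weights stay at their random initialization;
    only the edge scores are trained."""

    def __init__(self, in_features, out_features, k=0.5):
        super().__init__(in_features, out_features, bias=False)
        self.weight.requires_grad = False  # weights are never updated
        self.scores = nn.Parameter(0.01 * torch.randn_like(self.weight))
        self.k = k  # fraction of edges kept

    def forward(self, x):
        mask = GetSubnet.apply(self.scores, self.k)
        return F.linear(x, self.weight * mask)


def run_iep(model, train_fn, keep_schedule=(0.8, 0.6, 0.5), base_lr=0.1):
    """Iterative edge-popup sketch: rerun EP at gradually higher sparsity,
    rewinding the learning-rate schedule before each iteration."""
    for k in keep_schedule:
        for m in model.modules():
            if isinstance(m, MaskedLinear):
                m.k = k  # tighten the pruning rate
        train_fn(model, lr=base_lr)  # train_fn restarts its own LR schedule
```

Under this reading, the dying edge problem corresponds to score entries whose mask value stays zero for the whole search, so the effective subnetwork never moves far from the initial top-k selection; rerunning the search with a rewound learning rate gives low-scored edges renewed chances to enter the subnetwork.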