Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
37th (2023)
Session ID : 2A4-GS-2-04

Pruning Randomly Connected Neural Networks by Iterative Edge-Pop
*Yusuke IWASAWA, Masato HIRAKAWA, Yutaka MATSUO
Abstract

Recent theoretical and empirical studies have shown the existence of strong lottery tickets (SLTs), i.e., sparse subnetworks that achieve high performance \textit{without any weight updates}, in randomly initialized over-parameterized networks. However, little is known about how these SLTs are discovered by the de facto standard edge-popup algorithm (EP), making it difficult to improve its performance. In this paper, we first show that EP suffers from \textit{the dying edge problem}: most weights are \textit{never} used during the entire search process, suggesting that EP only searches around the randomly selected initial subnetwork. We then propose \textit{iterative edge-popup (iEP)}, which repeats EP while gradually increasing the pruning rate and rewinding the learning rate after each iteration. To validate the effectiveness of the proposed method, we conducted experiments on the ImageNet, CIFAR-10, and CIFAR-100 datasets. On ImageNet, iEP achieved 76.0\% accuracy with approximately 20 million parameters, whereas regular weight training achieved 73.3\% accuracy with 22 million parameters and EP achieved 73.3\% accuracy with about 20 million parameters. Our results also provide new insight into why iterative pruning often helps to find good sparse networks.
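The iEP procedure summarized above can be illustrated with a short sketch. The PyTorch code below is a hypothetical, minimal illustration and not the authors' implementation: the layer and function names (GetSubnet, EPLinear, run_ep), the toy model, the synthetic data, and the pruning-rate schedule are all assumptions. It only shows the loop structure: run edge-popup with frozen random weights and trainable edge scores, then repeat while gradually increasing the pruning rate (decreasing the keep ratio) and rewinding the learning rate each round.

```python
# Minimal sketch of the iterative edge-popup (iEP) loop, assuming a
# PyTorch setting. All names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GetSubnet(torch.autograd.Function):
    """Straight-through top-k mask over edge scores (core of edge-popup)."""

    @staticmethod
    def forward(ctx, scores, keep_ratio):
        # Keep the highest-scoring fraction of edges, zero out the rest.
        flat = scores.flatten()
        k = max(1, int(keep_ratio * flat.numel()))
        threshold = torch.topk(flat, k).values.min()
        return (scores >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: pass gradients to the scores unchanged.
        return grad_output, None


class EPLinear(nn.Module):
    """Linear layer with frozen random weights and trainable edge scores."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features) * 0.1,
            requires_grad=False,  # weights are never updated
        )
        self.scores = nn.Parameter(torch.rand(out_features, in_features))
        self.keep_ratio = 1.0  # fraction of edges kept (1 - pruning rate)

    def forward(self, x):
        mask = GetSubnet.apply(self.scores, self.keep_ratio)
        return F.linear(x, self.weight * mask)


def run_ep(model, data, target, keep_ratio, lr, epochs=3):
    """One EP round: train only the scores at a fixed keep ratio."""
    for m in model.modules():
        if isinstance(m, EPLinear):
            m.keep_ratio = keep_ratio
    scores = [p for n, p in model.named_parameters() if "scores" in n]
    opt = torch.optim.SGD(scores, lr=lr)  # fresh optimizer = rewound LR
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(data), target)
        loss.backward()
        opt.step()
    return loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = nn.Sequential(EPLinear(20, 64), nn.ReLU(), EPLinear(64, 2))
    data, target = torch.randn(128, 20), torch.randint(0, 2, (128,))

    # iEP: repeat EP while gradually increasing the pruning rate
    # (i.e., decreasing the keep ratio) and rewinding the learning rate.
    for keep_ratio in (0.8, 0.6, 0.4, 0.2):
        loss = run_ep(model, data, target, keep_ratio, lr=0.1)
        print(f"keep_ratio={keep_ratio:.1f}  loss={loss:.4f}")
```

In this sketch, learning-rate rewinding is represented simply by constructing a fresh optimizer at the initial learning rate for each round; the schedule and pruning-rate increments used in the paper may differ.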

© 2023 The Japanese Society for Artificial Intelligence