Transactions of the Japanese Society for Artificial Intelligence
Online ISSN : 1346-8030
Print ISSN : 1346-0714
ISSN-L : 1346-0714
Original Paper
Model Compression for ResNet via Layer Erasure and Re-training
Yasutoshi Ida, Yasuhiro Fujiwara

2020 Volume 35 Issue 3 Pages C-JA3_1-10

Abstract

Residual Networks with convolutional layers are widely used in the field of machine learning. Since they effectively extract features from input data by stacking multiple layers, they achieve high accuracy in many applications. However, stacking many layers increases their computation cost. To address this problem, we propose Network Implosion, which erases multiple layers from Residual Networks without degrading accuracy. Our key idea is to introduce a priority term that identifies the importance of each layer; we can select unimportant layers according to the priority and erase them after training. In addition, we retrain the networks to avoid critical drops in accuracy after layer erasure. Our experiments show that, for classification on CIFAR10/100 and ImageNet, Network Implosion can reduce the number of layers by 24.00% to 42.86% without any drop in accuracy.
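To make the train/erase/retrain cycle concrete, below is a minimal sketch in PyTorch. It assumes one plausible reading of the abstract: the priority term is a learnable scalar gate that scales each residual branch, and blocks with the smallest learned |priority| are erased before retraining. The names GatedResidualBlock and erase_least_important are hypothetical illustrations, not the paper's implementation.

    import torch
    import torch.nn as nn

    class GatedResidualBlock(nn.Module):
        # Residual block whose branch output is scaled by a learnable
        # scalar "priority" (an assumed form of the paper's priority
        # term); a small learned priority marks the block as a
        # candidate for erasure.
        def __init__(self, channels):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
            )
            self.priority = nn.Parameter(torch.ones(1))

        def forward(self, x):
            return torch.relu(x + self.priority * self.body(x))

    def erase_least_important(blocks, num_to_erase):
        # Rank blocks by |priority| and drop the lowest-ranked ones.
        ranked = sorted(blocks, key=lambda b: b.priority.abs().item())
        doomed = {id(b) for b in ranked[:num_to_erase]}
        return nn.ModuleList(b for b in blocks if id(b) not in doomed)

    # Usage: train the full network, erase the least important blocks,
    # then retrain (fine-tune) to recover any accuracy lost to erasure.
    blocks = nn.ModuleList(GatedResidualBlock(64) for _ in range(8))
    # ... train the full network here ...
    blocks = erase_least_important(blocks, num_to_erase=2)
    # ... retrain the smaller network here ...

Because the residual connection passes the input through unchanged, removing a gated block leaves a valid network of the same width, which is what makes the subsequent retraining step feasible.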

© The Japanese Society for Artificial Intelligence 2020