Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
36th (2022)
Session ID : 3J4-OS-3b-01

Stopping criterion for Neural Architecture Search
*Kotaro SAKAMOTO, Hideaki ISHIBASHI, Rei SATO, Shinichi SHIRAKAWA, Yohei AKIMOTO, Hideitsu HINO
Abstract

Neural architecture search (NAS) is a framework for automating the design of neural network structures. Although recent one-shot approaches have reduced the search cost, an inherent trade-off between cost and performance remains, so it is important to stop the search at an appropriate point and further minimise the high cost of NAS. Meanwhile, heuristic early-stopping strategies have been proposed to overcome the well-known performance degradation of the one-shot approach, in particular differentiable architecture search (DARTS). In this paper, we propose a more versatile and principled early-stopping criterion based on evaluating the gap between the expected generalisation errors of the previous and current search steps, where the expectation is taken with respect to the architecture parameters. The stopping threshold is determined automatically at each search epoch at no additional cost. In numerical experiments, we demonstrate the effectiveness of the proposed method: we stop one-shot NAS algorithms such as ASNG-NAS and DARTS and evaluate the acquired architectures on the benchmark datasets NAS-Bench-201 and NATS-Bench. The proposed method is shown to reduce the cost of the search process while maintaining high performance.
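
As a rough illustration of the idea, the sketch below shows a generic stopping check of this form. The function names, the Monte-Carlo estimate of the expected error, and the fixed threshold value are assumptions introduced for illustration; the paper's automatic, per-epoch threshold determination is not reproduced here.

import numpy as np

def expected_error(sampled_losses):
    # Monte-Carlo estimate of the expected generalisation (validation) error
    # E_{a ~ p_theta(a)}[L_val(a)], where architectures a are sampled from the
    # current architecture-parameter distribution (illustrative assumption).
    return float(np.mean(sampled_losses))

def should_stop(prev_expected, curr_expected, threshold):
    # Stop the search when the gap between the expected generalisation errors
    # of the previous and current search steps falls below the threshold.
    return abs(prev_expected - curr_expected) < threshold

# Toy usage with made-up validation losses of sampled architectures:
prev = expected_error([0.42, 0.40, 0.43])  # previous search step
curr = expected_error([0.41, 0.40, 0.42])  # current search step
print(should_stop(prev, curr, threshold=0.005))  # False: gap still large, keep searching

In a one-shot NAS loop, such a check would be evaluated once per search epoch, with the threshold recomputed at each epoch rather than fixed in advance.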

© 2022 The Japanese Society for Artificial Intelligence