IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Regular Section
Ultra-Fast NAS Based on Normalized Generalization Error with Random NTK
Keigo WAKAYAMA, Takafumi KANAMORI

2025 Volume E108.D Issue 10 Pages 1194-1205

Abstract

Neural architecture search (NAS) is highly useful for automating the design of DNN architectures. In recent years, a number of training-free NAS methods have been proposed, and their reduced search cost has raised expectations for real-world applications. In NASI, a state-of-the-art (SOTA) training-free NAS with a theoretical foundation, however, the proxy for estimating the test performance of candidate architectures is based on the training error rather than the generalization error. In this research, we propose NAS-NGE, i.e., NAS based on normalized generalization error, whose proxy is theoretically derived from the bias-variance decomposition of the normalized generalization error. Specifically, we propose a surrogate for the normalized second-order moment of the Neural Tangent Kernel (NTK) and combine it with the normalized bias to construct NAS-NGE. Using NAS benchmarks and the DARTS search space, we demonstrate the effectiveness of the proposed method by comparing it to SOTA training-free NAS methods under short search times.
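The exact NAS-NGE proxy is defined in the paper itself; as a rough, hypothetical illustration of the ingredients named in the abstract, the sketch below computes an empirical NTK at random initialization for a toy MLP and a scale-free (normalized) second-moment statistic of its spectrum. The toy network, the trace-based normalization, and all function names are assumptions for illustration only, not the paper's formulas or its search spaces (NAS benchmarks, DARTS).

```python
import jax
import jax.numpy as jnp


def init_params(key, sizes=(8, 32, 32, 1)):
    # Hypothetical toy MLP; the architectures actually searched in the
    # paper come from NAS benchmarks / DARTS, not from this sketch.
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]


def forward(params, x):
    # Scalar-output network f(theta, x).
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b).squeeze(-1)


def empirical_ntk(params, xs):
    # Empirical NTK at initialization: Theta_ij = <df(x_i)/dtheta, df(x_j)/dtheta>.
    flat_grad = lambda x: jnp.concatenate(
        [jnp.ravel(g) for g in jax.tree_util.tree_leaves(
            jax.grad(lambda p: forward(p, x))(params))])
    jac = jax.vmap(flat_grad)(xs)          # (n_samples, n_params)
    return jac @ jac.T                     # (n_samples, n_samples)


key = jax.random.PRNGKey(0)
params = init_params(key)
xs = jax.random.normal(key, (16, 8))       # random probe inputs
ntk = empirical_ntk(params, xs)

# Scale-free quantities one might derive from the NTK spectrum
# (assumption: plausible surrogates, not the paper's NAS-NGE proxy).
tr = jnp.trace(ntk)
second_moment = jnp.trace(ntk @ ntk)
normalized_second_moment = second_moment / tr ** 2
print(float(normalized_second_moment))
```

In a training-free search loop, a statistic like this would be evaluated once per candidate architecture at random initialization and used to rank candidates without any training, which is what keeps the search cost low.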

© 2025 The Institute of Electronics, Information and Communication Engineers