Host: The Japanese Society for Artificial Intelligence
Name: The 37th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 37
Location: [in Japanese]
Date: June 06, 2023 - June 09, 2023
Neural Architecture Search (NAS) aims to automatically design, in an exploratory manner, the structure of a deep neural network, which is conventionally designed by experts. Weight-sharing (WS)-based NAS is known as a time-efficient NAS approach because it learns the network structure and the weight parameters simultaneously in a single training session. However, WS-based NAS has been observed to suffer from a problem: depending on the design of the search space (the set of possible combinations of operations), the search may converge to an architecture with significantly low final performance. Since an appropriate search-space design is task-dependent, the user must design it carefully, which hinders the application of WS-based NAS. In this study, we show that a simple modification to the aggregation operation in the search space helps to mitigate this issue.
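The abstract does not detail the aggregation operation itself. As a minimal sketch of the conventional aggregation used in WS-based NAS (a DARTS-style weighted sum of candidate operations, with softmax over architecture parameters), assuming illustrative stand-in operations rather than the paper's actual search space:

```python
import numpy as np

def softmax(alpha):
    # Numerically stable softmax over architecture logits.
    e = np.exp(alpha - alpha.max())
    return e / e.sum()

# Hypothetical candidate operations on one edge of the search space.
ops = [
    lambda x: x,                 # identity / skip connection
    lambda x: np.zeros_like(x),  # "zero" (no-connection) operation
    lambda x: 2.0 * x,           # stand-in for a learned op (e.g. a conv)
]

def mixed_op(x, alpha):
    """Aggregate candidate-op outputs as sum_i softmax(alpha)_i * op_i(x).

    alpha holds one logit per candidate operation; both alpha and the
    shared weights inside the ops are trained in a single session.
    """
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = np.ones(4)
alpha = np.zeros(3)       # uniform weights (1/3 each) at initialization
y = mixed_op(x, alpha)    # (1/3)*x + (1/3)*0 + (1/3)*2x = x
```

After the search, the operation with the largest architecture weight on each edge is typically retained; the paper's proposed modification replaces this aggregation step.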