Host: The Japanese Society for Artificial Intelligence
Name: 34th Annual Conference, 2020
Number: 34
Location: Online
Date: June 09, 2020 - June 12, 2020
This paper proposes a gradient-based architecture search method for deep multimodal neural networks. Differentiable Architecture Search (DARTS) is a gradient-based method that enables efficient architecture search by defining a continuous search space and optimizing it with gradient descent. The proposed method extends DARTS and is specialized for deep multimodal neural networks. It can handle variable-length sequential input data because it includes a Long Short-Term Memory (LSTM) among its candidate operations. Experiments on an emotion recognition dataset that includes time-series data show that the proposed method finds an architecture whose performance is competitive with the manually designed network from previous work.
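To make the idea concrete, the following is a minimal sketch of a DARTS-style mixed operation whose candidate set includes an LSTM, so the searched cell can process variable-length sequential inputs. It assumes PyTorch; the class names (`MixedOp`, `LSTMOp`), the particular candidate operations, and the feature dimension are illustrative assumptions, not the paper's actual search space.

```python
# Sketch: DARTS-style continuous relaxation with an LSTM candidate operation.
# Assumes inputs of shape (batch, time, dim); names are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LSTMOp(nn.Module):
    """Wraps an LSTM so it maps (batch, time, dim) -> (batch, time, dim)."""

    def __init__(self, dim: int):
        super().__init__()
        self.lstm = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)
        return out


class MixedOp(nn.Module):
    """Softmax-weighted sum of candidate operations (continuous relaxation)."""

    def __init__(self, dim: int):
        super().__init__()
        # Candidate operations; including an LSTM lets the searched cell
        # handle variable-length time-series inputs.
        self.ops = nn.ModuleList([
            nn.Identity(),        # skip connection
            nn.Linear(dim, dim),  # fully connected layer
            LSTMOp(dim),          # recurrent candidate
        ])
        # Architecture parameters (alpha), optimized by gradient descent
        # on the validation loss, as in DARTS.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


# Usage: mix a batch of sequential features of shape (batch, time, dim).
mixed = MixedOp(dim=64)
y = mixed(torch.randn(8, 20, 64))  # -> (8, 20, 64)
```

After the search, the standard DARTS discretization step would keep only the candidate with the largest architecture weight at each edge; the sketch above shows only the relaxed (search-time) computation.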