Proceedings of the Annual Conference of JSAI
Online ISSN: 2758-7347
36th (2022)
Session ID: 2S4-IS-2b-03

Optimization of Convolutional Neural Network Using the Linearly Decreasing Weight Particle Swarm Optimization
*Tatsuki SERIZAWA, Hamido FUJITA
Abstract

The convolutional neural network (CNN) is one of the most frequently used deep learning techniques. When training a CNN, it is necessary to determine appropriate hyperparameters, and methods based on metaheuristic algorithms have attracted attention in research on hyperparameter optimization. In particular, particle swarm optimization converges faster than genetic algorithms, and various models based on it have been proposed. In this paper, we propose CNN hyperparameter optimization with linearly decreasing weight particle swarm optimization (LDWPSO). In the experiments, we optimize the CNN hyperparameters with LDWPSO, train on the MNIST and CIFAR-10 datasets, and compare the accuracy with a standard CNN based on LeNet-5. On the MNIST dataset, the baseline CNN reaches 94.02% at the 5th epoch, compared to 98.95% for the LDWPSO CNN, improving accuracy. On the CIFAR-10 dataset, the baseline CNN reaches 26.45% at the 9th epoch, compared to 69.53% for the LDWPSO CNN, greatly improving accuracy.
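The core of LDWPSO is a standard particle swarm update in which the inertia weight decreases linearly from an initial value to a final value over the iterations, shifting the search from exploration to exploitation. The sketch below illustrates only that update rule on a toy objective; the paper applies it to CNN hyperparameters, and the weight bounds w_max = 0.9, w_min = 0.4 and coefficients c1 = c2 = 2.0 are common defaults in the PSO literature, not values taken from this abstract.

```python
import random


def ldwpso(objective, dim, bounds, n_particles=20, n_iter=100,
           w_max=0.9, w_min=0.4, c1=2.0, c2=2.0, seed=0):
    """Minimize `objective` with linearly decreasing weight PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Random initial positions, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for t in range(n_iter):
        # Inertia weight decreases linearly from w_max to w_min.
        w = w_max - (w_max - w_min) * t / max(n_iter - 1, 1)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp positions to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val


if __name__ == "__main__":
    # Toy objective: the sphere function, minimized at the origin.
    best, best_val = ldwpso(lambda x: sum(v * v for v in x),
                            dim=2, bounds=(-5.0, 5.0))
    print(best, best_val)
```

For hyperparameter search, `objective` would instead decode each particle's position into a CNN configuration (e.g. filter counts, kernel sizes) and return a validation error after a short training run.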

© 2022 The Japanese Society for Artificial Intelligence