Host: The Japanese Society for Artificial Intelligence
Name : 34th Annual Conference, 2020
Number : 34
Location : Online
Date : June 09, 2020 - June 12, 2020
Deep learning without a sufficient volume of data degrades learning performance due to over-fitting. Although data augmentation can reduce over-fitting, excessive augmentation causes performance degradation, called over-regularization. To address these problems, we previously proposed a data augmentation method that avoids over-regularization by considering the shape of the data distribution, and demonstrated its effectiveness for classification. In this research, we extend this data augmentation method to regression. The extension is based on Adaptive Truncated Residual Networks (ATR-Nets). ATR-Nets consist of a classification part and a regression part. Given an output space represented by discrete points (anchors), the classification part judges which anchor corresponds to the output. The regression part then predicts the difference between the output and its corresponding anchor. We incorporate our data augmentation method into both parts using the anchors. The effectiveness of this idea has been verified in comparison with models trained with no data augmentation and with Gaussian-based augmentation.
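The anchor-based prediction described above can be illustrated with a minimal sketch: the classification part selects the most likely anchor, and the regression part's residual for that anchor is added to obtain the final output. All names and values below are illustrative assumptions, not the authors' implementation; in ATR-Nets both parts are learned networks.

```python
# Minimal sketch of anchor-based regression inference (ATR-Nets-style).
# The classifier's probabilities and the residuals are hypothetical
# stand-ins for the outputs of the learned classification/regression parts.

def predict(anchors, class_probs, residuals):
    """Combine an anchor classification with a residual regression.

    anchors:     discrete points covering the output space
    class_probs: classifier's probability for each anchor
    residuals:   regressor's predicted offset from each anchor
    """
    # The classification part judges which anchor corresponds to the output.
    k = max(range(len(anchors)), key=lambda i: class_probs[i])
    # The regression part predicts the difference from that anchor.
    return anchors[k] + residuals[k]

# Example: output space [0, 1] discretized into five anchors.
anchors = [0.0, 0.25, 0.5, 0.75, 1.0]
class_probs = [0.05, 0.10, 0.70, 0.10, 0.05]  # classifier favors anchor 0.5
residuals = [0.0, 0.0, 0.08, 0.0, 0.0]        # per-anchor residual offsets
print(predict(anchors, class_probs, residuals))  # ≈ 0.58
```

Because the residual is truncated to the spacing between anchors, the regression part only needs to cover a small range, which is the motivation for the two-part decomposition.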