Host: The Japanese Society for Artificial Intelligence
Name: The 37th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 37
Location: [in Japanese]
Date: June 06, 2023 - June 09, 2023
Conventional domain adaptation methods perform poorly when there is a large gap between the source and target domains. One approach to this problem is gradual domain adaptation, which leverages intermediate domains that shift gradually from the source domain to the target domain. Previous studies assumed that many intermediate domains are available and that the distances between adjacent domains are small; under this assumption, gradual domain adaptation can be carried out by self-training on the unlabeled intermediate datasets. In practice, however, gradual self-training fails when only a few intermediate domains are available and the distances between adjacent domains are large. To address this issue while remaining within the framework of unsupervised domain adaptation, we propose using normalizing flows: we generate pseudo intermediate domains with normalizing flows and use them for gradual domain adaptation. We evaluate the proposed method in experiments on real-world datasets and confirm that it mitigates the above problem and improves classification performance.
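Because the abstract builds on gradual self-training over a chain of intermediate domains, the following minimal sketch may help fix the idea. It is not the paper's implementation: the rotating two-moons data, the LogisticRegression classifier, and the hand-crafted rotated intermediates (standing in for the flow-generated pseudo intermediate domains) are illustrative assumptions only.

```python
# Minimal sketch of gradual self-training, the baseline the abstract starts from.
# Assumptions for illustration: synthetic rotating two-moons data, a
# LogisticRegression classifier, and rotated copies of the data as stand-ins
# for the flow-generated pseudo intermediate domains used in the paper.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression


def rotate(X, degrees):
    """Rotate 2-D features to simulate a gradually shifting domain."""
    theta = np.deg2rad(degrees)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta), np.cos(theta)]])
    return X @ R.T


def gradual_self_train(model, X_source, y_source, unlabeled_domains):
    """Fit on the labeled source, then walk through the unlabeled domains,
    pseudo-labeling each one with the current model and refitting on it."""
    model.fit(X_source, y_source)
    for X_unlabeled in unlabeled_domains:
        pseudo_labels = model.predict(X_unlabeled)
        model.fit(X_unlabeled, pseudo_labels)
    return model


# Labeled source domain and a target domain rotated by 60 degrees.
X_src, y_src = make_moons(n_samples=500, noise=0.1, random_state=0)
X_tgt_raw, y_tgt = make_moons(n_samples=500, noise=0.1, random_state=1)
X_tgt = rotate(X_tgt_raw, 60)

# Unlabeled intermediate domains at 15-degree steps stand in for the
# pseudo intermediate domains; the target domain comes last in the chain.
intermediates = [rotate(make_moons(n_samples=500, noise=0.1, random_state=2 + i)[0], deg)
                 for i, deg in enumerate((15, 30, 45))] + [X_tgt]

direct = LogisticRegression().fit(X_src, y_src)  # source-only baseline
gradual = gradual_self_train(LogisticRegression(), X_src, y_src, intermediates)
print("direct source-to-target accuracy:", direct.score(X_tgt, y_tgt))
print("gradual self-training accuracy  :", gradual.score(X_tgt, y_tgt))
```

With many closely spaced intermediate domains, the gradual chain typically outperforms the direct source-only baseline; the abstract's point is that when the available intermediate domains are few and far apart this chain breaks down, which is what the flow-generated pseudo intermediate domains are intended to repair.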