Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
37th (2023)
Session ID : 4I3-OS-1b-02

Gradual Domain Adaptation with Generative Models
*Shogo SAGAWA, Hideitsu HINO
Abstract

Conventional domain adaptation methods perform poorly when there is a large gap between the source and target domains. One approach to this problem is gradual domain adaptation, which leverages intermediate domains that shift gradually from the source domain to the target domain. Previous studies assumed numerous intermediate domains with small distances between adjacent domains, an assumption under which gradual self-training with unlabeled datasets is applicable. In practice, however, gradual self-training fails when the intermediate domains are few and the distances between adjacent domains are large. To address this issue while remaining within the framework of unsupervised domain adaptation, we propose using normalizing flows: we generate pseudo intermediate domains with normalizing flows and use them for gradual domain adaptation. Experiments on real-world datasets confirm that the proposed method mitigates this problem and improves classification performance.
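To illustrate the gradual self-training idea the abstract builds on, here is a minimal toy sketch (not the paper's method, and with no normalizing flows): a nearest-centroid classifier is fit on a labeled source domain, then each unlabeled intermediate domain is pseudo-labeled in order of increasing shift and the model is refit on the pseudo-labels. The `make_domain` data generator, the shift schedule, and the centroid classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_domain(shift, n=200):
    # Toy stand-in for a gradually shifting domain: two Gaussian
    # classes whose means translate horizontally by `shift`.
    x0 = rng.normal([-2.0 + shift, 0.0], 0.5, (n, 2))
    x1 = rng.normal([2.0 + shift, 0.0], 0.5, (n, 2))
    return np.vstack([x0, x1]), np.repeat([0, 1], n)

def fit_centroids(x, y):
    # One centroid per class.
    return np.stack([x[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, x):
    # Assign each point to the nearest class centroid.
    d = np.linalg.norm(x[:, None] - centroids[None], axis=2)
    return d.argmin(axis=1)

# Source domain: labeled.
xs, ys = make_domain(0.0)
centroids = fit_centroids(xs, ys)

# Gradual self-training: pseudo-label each unlabeled domain in
# order of increasing shift, then refit on the pseudo-labels.
for shift in [1.0, 2.0, 3.0, 4.0]:
    xu, _ = make_domain(shift)          # labels are never used
    pseudo = predict(centroids, xu)
    centroids = fit_centroids(xu, pseudo)

# Evaluate on the target domain (largest shift).
xt, yt = make_domain(4.0)
acc = (predict(centroids, xt) == yt).mean()
print(f"target accuracy after gradual self-training: {acc:.2f}")
```

Note that adapting directly from the source to the target (shift 4.0) would fail here, since the shifted class-0 cluster lands on the original class-1 centroid; the small intermediate steps are what keep the pseudo-labels accurate. The paper's contribution addresses the case where such closely spaced intermediate domains are unavailable and must be generated.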

© 2023 The Japanese Society for Artificial Intelligence