IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Regular Section
Distilling Distribution Knowledge in Normalizing Flow
Jungwoo KWON, Gyeonghwan KIM

2023, Volume E106.D, Issue 8, pp. 1287-1291

Abstract

In this letter, we propose a feature-based knowledge distillation scheme that transfers knowledge between intermediate blocks of a teacher and a student built on flow-based architectures, specifically normalizing flows in our implementation. In addition to the knowledge transfer scheme, we examine how the configuration of the distillation positions affects knowledge transfer performance. To evaluate the proposed ideas, we choose two knowledge distillation baseline models based on normalizing flows in different domains: CS-Flow for anomaly detection and SRFlow-DA for super-resolution. Performance comparisons against the baseline models on popular benchmark datasets show promising results along with improved inference speed. The comparison includes a performance analysis of various configurations of the distillation positions in the proposed scheme.
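The scheme summarized above matches intermediate features of a compact student flow against those of a deeper teacher flow at selected distillation positions. Below is a minimal PyTorch sketch of this general idea; the block structure, the teacher/student depths, the position pairing, and the MSE feature loss are all illustrative assumptions of ours, not the implementation the letter applies to CS-Flow or SRFlow-DA.

```python
# Hypothetical sketch of feature-based distillation between intermediate
# blocks of two flow models. All names and the MSE loss are assumptions;
# the abstract does not specify the letter's exact loss or block pairing.
import torch
import torch.nn as nn

class ToyFlowBlock(nn.Module):
    """Stand-in for one block of a flow model (e.g., a coupling layer)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        # Residual update as a toy surrogate; real normalizing flows use
        # exact invertible transforms with tractable Jacobians.
        return x + self.net(x)

def forward_with_features(blocks, x, positions):
    """Run x through the blocks, collecting outputs at the given positions."""
    feats = []
    for i, block in enumerate(blocks):
        x = block(x)
        if i in positions:
            feats.append(x)
    return x, feats

dim = 16
teacher = nn.ModuleList(ToyFlowBlock(dim) for _ in range(8))  # deeper teacher
student = nn.ModuleList(ToyFlowBlock(dim) for _ in range(4))  # lighter student

x = torch.randn(32, dim)
with torch.no_grad():  # teacher is frozen during distillation
    _, t_feats = forward_with_features(teacher, x, positions={1, 3, 5, 7})
_, s_feats = forward_with_features(student, x, positions={0, 1, 2, 3})

# Feature-based distillation loss: pull each student block output toward
# the teacher feature at its paired distillation position.
distill_loss = sum(nn.functional.mse_loss(s, t) for s, t in zip(s_feats, t_feats))
distill_loss.backward()
```

The `positions` sets are the knob the letter analyzes: which teacher blocks are paired with which student blocks determines the configuration of distillation positions whose effect on transfer performance is compared.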

© 2023 The Institute of Electronics, Information and Communication Engineers