ISIJ International
Online ISSN : 1347-5460
Print ISSN : 0915-1559
ISSN-L : 0915-1559

SOKD: A soft optimization knowledge distillation scheme for surface defects identification of hot-rolled strip
Wenyan Wang, Zheng Ren, Cheng Wang, Kun Lu, Tao Tao, Xuejuan Pan, Bing Wang
JOURNAL OPEN ACCESS Advance online publication

Article ID: ISIJINT-2024-159

Abstract

Surface defects of hot-rolled strip are a significant factor affecting the performance of strip products. In recent years, convolutional neural networks (CNNs) have been extensively used in strip surface defect recognition to ensure product quality. However, existing CNN-based methods face the challenges of high complexity, difficult deployment and slow inference speed. Accordingly, this work proposes a soft optimization knowledge distillation (SOKD) scheme to distill the large ResNet-152 model and extract a compact strip surface defect recognition model. The SOKD scheme uses Kullback-Leibler (KL) divergence to minimize the error between the soft probability distributions of the student network and the teacher network, and gradually reduces the weight of the "hard loss" during training. This operation significantly relaxes the learning constraints that the teacher network's prior knowledge imposes on the student network in the original KD, which improves the recognition performance of the model. Additionally, SOKD is applicable to most CNNs for identifying surface defects of hot-rolled strip. Experimental results on the NEU-CLS dataset show that SOKD outperforms state-of-the-art methods.
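The abstract describes a distillation objective that combines a temperature-softened KL-divergence term with a ground-truth "hard loss" whose weight is gradually reduced over training. A minimal NumPy sketch of such an objective is shown below; the function and parameter names, and the linear decay schedule for the hard-loss weight, are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_div(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions."""
    p, q = np.asarray(p), np.asarray(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def sokd_loss(student_logits, teacher_logits, label,
              epoch, total_epochs, T=4.0):
    """Sketch of a SOKD-style loss: soft KL term plus a hard
    cross-entropy term whose weight decays during training.
    The linear schedule here is a hypothetical choice."""
    # Soft loss: KL between temperature-softened distributions,
    # scaled by T^2 as is conventional in knowledge distillation.
    soft = kl_div(softmax(teacher_logits, T),
                  softmax(student_logits, T)) * T * T
    # Hard loss: cross-entropy against the ground-truth label.
    hard = -np.log(softmax(student_logits)[label] + 1e-12)
    # Gradually reduce the hard-loss weight as training proceeds.
    alpha = 1.0 - epoch / total_epochs
    return alpha * hard + (1.0 - alpha) * soft
```

Early in training the hard loss dominates; by the final epoch the student is guided almost entirely by the teacher's soft distribution.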

© 2024 The Iron and Steel Institute of Japan

This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs license
https://creativecommons.org/licenses/by-nc-nd/4.0/