ISIJ International
Online ISSN : 1347-5460
Print ISSN : 0915-1559
ISSN-L : 0915-1559
Regular Article
SOKD: A Soft Optimization Knowledge Distillation Scheme for Surface Defects Identification of Hot-Rolled Strip
Wenyan Wang, Zheng Ren, Cheng Wang, Kun Lu, Tao Tao, Xuejuan Pan, Bing Wang

2025 Volume 65 Issue 1 Pages 104-110

Abstract

Surface defects of hot-rolled strip are a significant factor affecting the performance of strip products. In recent years, convolutional neural networks (CNNs) have been extensively used in strip surface defect recognition to ensure product quality. However, existing CNN-based methods face the challenges of high complexity, difficult deployment, and slow inference speed. Accordingly, this work proposes a soft optimization knowledge distillation (SOKD) scheme to distill the large ResNet-152 model into a compact strip surface defect recognition model. The SOKD scheme uses Kullback-Leibler (KL) divergence to minimize the error between the soft probability distributions of the student network and the teacher network, and gradually reduces the weight of the "hard loss" during training. This operation significantly relaxes the learning constraints that the teacher network's prior knowledge imposes on the student network in the original KD, which improves the recognition performance of the model. Additionally, SOKD is applicable to most CNNs for identifying surface defects of hot-rolled strip. Experimental results on the NEU-CLS dataset show that SOKD outperforms state-of-the-art methods.
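The abstract describes a distillation objective combining a temperature-softened KL divergence term with a hard cross-entropy loss whose weight shrinks over training. The paper does not give the exact formulation here; the sketch below is a hypothetical illustration of such a loss, with the temperature `T`, the linear decay schedule, and the function names all being assumptions for demonstration only.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: larger T gives a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def sokd_loss(student_logits, teacher_logits, label, epoch, total_epochs, T=4.0):
    """Hypothetical SOKD-style objective: KL(teacher || student) on
    temperature-softened probabilities, plus a hard cross-entropy term
    whose weight decays linearly as training progresses."""
    p_t = softmax(teacher_logits, T)   # teacher's soft targets
    p_s = softmax(student_logits, T)   # student's soft predictions
    # "Soft loss": KL divergence, scaled by T^2 as is conventional in KD.
    soft = float(np.sum(p_t * (np.log(p_t) - np.log(p_s)))) * T * T
    # "Hard loss": cross-entropy against the ground-truth label.
    hard = float(-np.log(softmax(student_logits)[label]))
    # Linearly shrinking weight on the hard loss (an assumed schedule).
    alpha = 1.0 - epoch / total_epochs
    return alpha * hard + (1.0 - alpha) * soft
```

When student and teacher logits agree, the KL term vanishes, so late in training (small `alpha`) the loss approaches zero; early in training the ground-truth label dominates.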

© 2025 The Iron and Steel Institute of Japan.

This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs license.
https://creativecommons.org/licenses/by-nc-nd/4.0/