Self-supervised learning (SSL) has emerged as a method for learning representations from inputs without data annotations. A key challenge in SSL is avoiding the collapse of representations, which leads to poor predictive models. This work proposes an SSL framework that leverages quantum classifiers to address this issue. By embedding data into a Hilbert space via quantum classifiers, we obtain well-stratified, nontrivial representations. The algorithm minimizes the Kullback-Leibler (KL) divergence so that agreement between positive pairs is maximized and agreement between negative pairs is minimized. We demonstrate a proof of concept on image classification tasks using the MNIST, KMNIST, and FashionMNIST datasets. Classifiers trained with our framework achieve higher predictive performance than classical classifiers, reaching a maximum accuracy of 65% on MNIST and more than 70% on KMNIST and FashionMNIST on unseen samples without fine-tuning. The code is available at https://github.com/namhai03/QSSL.
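
The KL-divergence objective sketched in the abstract can be illustrated with a minimal example. This is not the authors' implementation (see the linked repository for that): the quantum-classifier embeddings are replaced by random stand-in tensors, and the one-hot target distribution over positive pairs, the temperature value, and the function name kl_contrastive_loss are assumptions made only to show the general shape of such a loss.

```python
# Minimal sketch (not the authors' code) of a KL-divergence-based
# contrastive objective: maximize agreement for positive pairs and
# minimize it for negative pairs within a batch.
import torch
import torch.nn.functional as F


def kl_contrastive_loss(z: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """KL-based contrastive loss for a batch of 2N embeddings.

    Rows i and i+N of ``z`` are assumed to be the two augmented views
    (a positive pair) of the same underlying sample.
    """
    z = F.normalize(z, dim=1)                  # unit-norm embeddings
    n = z.shape[0] // 2
    sim = z @ z.T / temperature                # pairwise similarities
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, -1e9)          # exclude self-similarity
    log_pred = F.log_softmax(sim, dim=1)       # predicted distribution per row

    # Target distribution: all probability mass on each row's positive partner,
    # so minimizing KL(target || pred) raises agreement for the positive pair
    # and lowers it for the (negative) remainder of the batch.
    idx = torch.arange(n)
    target = torch.zeros_like(sim)
    target[idx, idx + n] = 1.0
    target[idx + n, idx] = 1.0

    return F.kl_div(log_pred, target, reduction="batchmean")


# Usage with random stand-ins: two augmented views of a batch of 8 samples,
# each embedded into a 32-dimensional space.
z = torch.randn(16, 32, requires_grad=True)
loss = kl_contrastive_loss(z)
loss.backward()                                # gradients flow back to z
print(float(loss))
```

With a one-hot target, the KL divergence coincides with a cross-entropy over similarities; the paper's actual target distributions and the quantum-circuit encoder may differ, so this should be read only as an illustration of the abstract's description.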