2025 Volume 29 Issue 6 Pages 1565-1576
This paper proposes a revised margin-maximization method for training nearest prototype classifiers (NPCs), a class of explainable supervised learning models. The margin-maximization method of our previous study formulates NPC training as a difference-of-convex (DC) programming problem solved via the convex-concave procedure. However, it suffers from sensitivity to hyperparameters and an inability to optimize classification and clustering performance simultaneously. To overcome these drawbacks, the revised method solves the margin-maximization problem directly by sequential second-order cone programming, without the DC programming reduction. Furthermore, it integrates the clustering loss of the k-means method into the objective function to encourage prototype placement in dense data regions. We prove that the revised method is a descent algorithm, that is, the objective function decreases with each update of the solution. A numerical study confirms that the revised method addresses the drawbacks of the previous method.
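The abstract describes an objective that combines a margin-based classification loss over prototypes with the k-means clustering loss. The toy sketch below illustrates only that combined-loss idea; it is not the paper's algorithm. The paper minimizes the objective by sequential second-order cone programming, whereas this sketch uses plain numerical gradient descent, and the hinge-style margin term, the weight `lam`, and all variable names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data in 2-D
X = np.vstack([rng.normal(-2, 1, (30, 2)), rng.normal(2, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)

# One prototype per class; prototype labels are fixed
P = np.array([[-0.5, 0.0], [0.5, 0.0]])
labels = np.array([0, 1])
lam = 0.5  # assumed weight of the clustering (k-means) term

def objective(P):
    # Squared distances from every sample to every prototype
    d = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=2)
    d_same = np.where(labels[None, :] == y[:, None], d, np.inf).min(axis=1)
    d_other = np.where(labels[None, :] != y[:, None], d, np.inf).min(axis=1)
    # Hinge-style margin loss: nearest same-class prototype should be
    # closer than the nearest other-class prototype by a margin of 1
    margin_loss = np.maximum(0.0, 1.0 + d_same - d_other).mean()
    # k-means clustering loss: distance to the nearest prototype,
    # which pulls prototypes into dense data regions
    kmeans_loss = d.min(axis=1).mean()
    return margin_loss + lam * kmeans_loss

def grad(P, eps=1e-5):
    # Central-difference numerical gradient (adequate for a tiny demo)
    g = np.zeros_like(P)
    for i in range(P.shape[0]):
        for j in range(P.shape[1]):
            E = np.zeros_like(P)
            E[i, j] = eps
            g[i, j] = (objective(P + E) - objective(P - E)) / (2 * eps)
    return g

losses = [objective(P)]
for _ in range(50):
    P = P - 0.05 * grad(P)
    losses.append(objective(P))

# The combined objective decreases over the run
print(losses[0] > losses[-1])
```

Unlike the paper's method, fixed-step gradient descent carries no descent guarantee; the sketch only shows how the margin and clustering terms enter one objective.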