Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
38th (2024)
Session ID : 2L6-OS-19b-05
Estimating Generalization Error Bounds for Worst Weight-Perturbed Neural Classifiers
*Yoshinao ISOBE
Abstract

Evaluation indicators such as accuracy, precision, and recall on a dataset are widely used to evaluate neural classifiers, but such indicators cannot guarantee performance on unseen data not included in the dataset. In this presentation, we propose a method that statistically guarantees upper bounds on the expected misclassification rates (i.e., generalization errors) of worst weight-perturbed classifiers for any input data, including unseen data. Here, a worst weight-perturbation is a perturbation of the weight parameters, chosen within a given perturbation range, that causes misclassification if possible. Such upper bounds can be estimated from randomly selected perturbations, but worst weight-perturbations are in general difficult to find by random selection alone. We therefore combine random selection with gradient-based search to make the proposed method practical. We experimentally demonstrate that the method can estimate generalization error bounds for worst weight-perturbed classifiers, and we discuss its usefulness for evaluating classifiers.
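The two ingredients of the abstract, finding a near-worst weight-perturbation by combining random restarts with gradient-based search, and turning sampled misclassification rates into a statistical upper bound, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the logistic classifier, the projected gradient ascent, the function names (`worst_perturbation`, `hoeffding_upper_bound`), and the use of a one-sided Hoeffding bound are all assumptions made here for concreteness.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, X, y):
    # Binary cross-entropy loss of the classifier sigmoid(X @ w).
    p = sigmoid(X @ w)
    tiny = 1e-12
    return -np.mean(y * np.log(p + tiny) + (1 - y) * np.log(1 - p + tiny))

def loss_grad_w(w, X, y):
    # Gradient of the loss with respect to the weight vector.
    p = sigmoid(X @ w)
    return X.T @ (p - y) / len(y)

def worst_perturbation(w, X, y, eps, n_restarts=5, n_steps=50, lr=0.1, seed=None):
    """Search for a weight perturbation delta with ||delta||_inf <= eps
    that maximizes the loss, combining random restarts (random selection)
    with projected gradient ascent (gradient-based search)."""
    rng = np.random.default_rng(seed)
    best_delta, best_loss = np.zeros_like(w), loss(w, X, y)
    for _ in range(n_restarts):
        delta = rng.uniform(-eps, eps, size=w.shape)   # random starting point
        for _ in range(n_steps):
            g = loss_grad_w(w + delta, X, y)           # ascend the loss
            delta = np.clip(delta + lr * np.sign(g), -eps, eps)  # project back
        cur = loss(w + delta, X, y)
        if cur > best_loss:
            best_delta, best_loss = delta, cur
    return best_delta, best_loss

def hoeffding_upper_bound(errors, confidence=0.95):
    # One-sided Hoeffding bound: with probability >= confidence, the
    # expected misclassification rate lies below the returned value.
    n = len(errors)
    return np.mean(errors) + np.sqrt(np.log(1.0 / (1.0 - confidence)) / (2.0 * n))
```

On a sample of inputs, one would evaluate the misclassification indicator of the classifier under the perturbation returned by `worst_perturbation` and feed those 0/1 outcomes to `hoeffding_upper_bound` to obtain a high-confidence upper bound on the generalization error of the worst weight-perturbed classifier.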

© 2024 The Japanese Society for Artificial Intelligence