2025 Volume 40 Issue 2 Pages D-O96_1-10
While medical devices based on Artificial Intelligence (AI) are beginning to be approved in various countries and regions, there is concern over how to address biases inherent in big data and machine learning algorithms that could disadvantage specific patient groups. This study analyzes which sources of bias the regulatory authorities of each country/region focus on in governing three types of unwanted bias in post-market updates of AI-based medical devices, and what requirements they assume for risk management. Among the three types of unwanted bias, “data bias” had many sources that were commonly addressed by all three regions: Japan, the U.S., and Europe. By contrast, many sources of “human cognitive bias” went unrecognized, and responses to “bias introduced by engineering decisions” differed across regions. Analyzing the considerations for unwanted bias by stage of the AI system life cycle, we confirmed that all countries describe considerations for the deployment stage; for the other stages, the countries that describe considerations vary, revealing differences among countries at each life-cycle stage. Based on these results, we propose three measures to promote the fairer, globally harmonized development and approval of AI-based medical devices with reduced unwanted bias. The first is to adapt to the differences between countries in requirements for unwanted bias in order to expand internationally. The second is to involve interdisciplinary experts when designing rule-based systems in order to ensure transparency about the possible existence of cognitive biases. The third is to disclose the results of research into cognitive biases toward specific groups.
We believe that these measures, which address areas that current governance does not clearly set as requirements, will contribute to the early adoption of AI systems, which continue to evolve.