Journal of Digital Dentistry
Online ISSN : 2432-7654
Original Article
Possibility of Generative Pre-Trained Transformer Model for dental education
Tsukasa Yanagi, Ayako Yanagi (Sato), Yusuke Taniguchi, Ayako Matsumoto, Kanae Negoro, Seiichi Fujisaki, Kae Kakura, Hirofumi Kido

2024 Volume 13 Issue 3 Pages 91-97

Abstract

Recently, large language models have attracted attention for a wide range of applications.

In this study, we examined the possibility of using GPT (Generative Pre-Trained Transformer) models for dental education by measuring their dental knowledge on the Japanese National Dental Examination. Questions from the 114th to 116th national examinations were posed to GPT-3.5 and GPT-4, and the percentage of correct answers was compared with the passing standard; the percentage of correct answers was also calculated for each field. GPT-3.5 did not reach the passing standard in any area, while GPT-4 reached it in the Required and A areas but not in the B and C areas. In addition, the percentage of correct answers was high for general medicine questions but low for dentistry-specific questions. These results suggest that GPT-3.5 and GPT-4 are not suitable as dental education models.
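Computationally, the comparison described above reduces to per-field correct-answer rates checked against passing thresholds. The following Python sketch illustrates that bookkeeping only; the Question structure, the passing-standard values, and the ask_model callable are hypothetical stand-ins, as the article does not publish its scoring code.

```python
# Minimal sketch of the scoring procedure described in the abstract.
# Question, FIELD_PASSING_STANDARDS, and ask_model are illustrative
# assumptions, not the authors' actual pipeline.

from collections import defaultdict
from dataclasses import dataclass
from typing import Callable


@dataclass
class Question:
    exam: str            # e.g. "114th"
    field: str           # e.g. "Required", "A", "B", "C"
    text: str
    correct_answer: str  # expected choice label, e.g. "a"


# Hypothetical per-field passing standards (percent correct); the actual
# standards of the Japanese National Dental Examination vary by year.
FIELD_PASSING_STANDARDS = {"Required": 80.0, "A": 65.0, "B": 65.0, "C": 65.0}


def score_by_field(
    questions: list[Question],
    ask_model: Callable[[str], str],
) -> dict[str, float]:
    """Pose each question to the model and return percent correct per field."""
    correct: dict[str, int] = defaultdict(int)
    total: dict[str, int] = defaultdict(int)
    for q in questions:
        total[q.field] += 1
        if ask_model(q.text).strip().lower() == q.correct_answer.lower():
            correct[q.field] += 1
    return {f: 100.0 * correct[f] / total[f] for f in total}


def meets_standard(scores: dict[str, float]) -> dict[str, bool]:
    """Compare per-field scores with the (assumed) passing standards."""
    return {f: s >= FIELD_PASSING_STANDARDS.get(f, 0.0) for f, s in scores.items()}
```

Keeping the model call behind a callable makes it straightforward to run the same question set through GPT-3.5 and GPT-4 and compare the resulting per-field score tables.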

© 2024 Japan Academy of Digital Dentistry