Host: The Japanese Society for Artificial Intelligence
Name: The 37th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 37
Location: [in Japanese]
Date: June 6, 2023 - June 9, 2023
Transfer learning is a powerful technique that allows a model trained on one task to be fine-tuned on a different but related task. In this presentation, we will explore how to use transfer learning to perform text classification with BERT and its variants from Hugging Face. BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained model that has achieved state-of-the-art results on a wide range of natural language understanding tasks. By fine-tuning BERT on a labeled text classification dataset, we can quickly train a high-performing model with minimal data and computational resources. We will demonstrate how to fine-tune BERT using the Hugging Face Transformers library, as sketched below, and provide tips and best practices for getting the most out of this technique. Attendees will leave with a solid understanding of how to use transfer learning for text classification and the knowledge to implement their own text classification models using BERT. We will also show how to rapidly implement this workflow with open-source MLflow and Transformers.
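As an illustration of the workflow described above, here is a minimal sketch of fine-tuning BERT for text classification with the Hugging Face Transformers Trainer API. The dataset (IMDB sentiment), the output directory name, and the hyperparameters are illustrative assumptions, not values taken from the presentation itself.

import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Load a labeled text classification dataset (assumed: IMDB sentiment).
dataset = load_dataset("imdb")

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Truncate/pad each example to a fixed maximum length.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=256)

dataset = dataset.map(tokenize, batched=True)

# Load pre-trained BERT with a fresh classification head (2 labels).
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

def compute_metrics(eval_pred):
    # Report plain accuracy on the evaluation split.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"accuracy": (preds == labels).mean()}

args = TrainingArguments(
    output_dir="bert-text-clf",
    learning_rate=2e-5,              # typical fine-tuning rate for BERT
    per_device_train_batch_size=16,
    num_train_epochs=3,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,             # so save_model() also saves the tokenizer
    compute_metrics=compute_metrics,
)

trainer.train()
trainer.save_model("bert-text-clf")  # hypothetical directory, reused below

The small learning rate and handful of epochs are the usual regime for BERT fine-tuning: only the lightweight classification head is trained from scratch, while the pre-trained encoder weights are gently adjusted.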
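The abstract also mentions MLflow. The following is a minimal sketch, assuming MLflow 2.3 or later (which added the transformers model flavor), of how the fine-tuned model from the sketch above could be tracked and logged; the experiment name and logged parameters are illustrative assumptions. Note that Transformers' Trainer can also report training metrics to MLflow automatically via its built-in MLflow integration when mlflow is installed.

import mlflow
from transformers import pipeline

mlflow.set_experiment("bert-text-classification")  # assumed experiment name

with mlflow.start_run():
    # Record the hyperparameters used for fine-tuning.
    mlflow.log_params({
        "base_model": "bert-base-uncased",
        "learning_rate": 2e-5,
        "epochs": 3,
    })

    # Wrap the saved fine-tuned model (from the previous sketch) in a pipeline.
    clf = pipeline("text-classification", model="bert-text-clf")

    # Log the pipeline as an MLflow model artifact so it can be reloaded
    # or served later with a single call.
    mlflow.transformers.log_model(transformers_model=clf, artifact_path="model")

# The logged model can later be reloaded with:
#   mlflow.transformers.load_model("runs:/<run_id>/model")

Logging the model through the transformers flavor keeps the tokenizer, weights, and pipeline configuration together as one versioned artifact, which is what makes the rapid implementation mentioned above practical.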