JSAI2023

Presentation information

Organized Session

[1Q5-OS-29] Transfer Learning: Methods and Applications

Tue. Jun 6, 2023 5:00 PM - 6:40 PM Room Q (601)

Organizers: 顔 玉蘭, 王 洋, Leo Mao, Yinxi Zhang, Junhua Wu

5:00 PM - 5:20 PM

[1Q5-OS-29-01] Leveraging BERT for Text Classification: A Hands-on Guide to Transfer Learning

Using Hugging Face to Fine-tune BERT for High-Performance Text Classification

〇Yang Wang1 (1. Databricks)

Keywords: Transfer Learning, NLP, Deep Learning

Transfer learning is a powerful technique that allows a model trained on one task to be fine-tuned on a different but related task. In this presentation, we will explore how to use transfer learning for text classification with the BERT model and its variants from Hugging Face. BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained model that has been shown to achieve state-of-the-art results on a wide range of natural language understanding tasks. By fine-tuning BERT on a labeled text-classification dataset, we can quickly and easily train a high-performance model with minimal data and computational resources. We will demonstrate how to fine-tune BERT using the Hugging Face library and share tips and best practices for getting the most out of this technique. Attendees will leave with a solid understanding of how to use transfer learning for text classification and the knowledge to implement their own text-classification models with BERT. We will also show how to implement this rapidly with the open-source MLflow and Transformers libraries.
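The fine-tuning workflow the abstract describes can be sketched as follows. This is a minimal illustration, not the presenter's actual code: the checkpoint (`bert-base-uncased`), the four-example toy dataset, and the hyperparameters are all assumptions chosen to keep the example small.

```python
# Minimal sketch: fine-tuning BERT for binary text classification with
# Hugging Face Transformers. Checkpoint, data, and hyperparameters are
# illustrative assumptions only.
import torch
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Toy labeled dataset; in practice, load your own corpus.
texts = ["great movie", "terrible plot", "loved it", "waste of time"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

checkpoint = "bert-base-uncased"  # any BERT variant on the Hub works
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=2
)

# Tokenize once, then wrap in a torch Dataset that Trainer understands.
encodings = tokenizer(texts, truncation=True, padding=True)

class TextDataset(torch.utils.data.Dataset):
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

train_dataset = TextDataset(encodings, labels)

args = TrainingArguments(
    output_dir="bert-clf",
    num_train_epochs=1,              # tiny values: this is only a sketch
    per_device_train_batch_size=2,
    report_to="none",                # disable experiment-tracker callbacks
)
Trainer(model=model, args=args, train_dataset=train_dataset).train()
```

The fine-tuned model can then be tracked with MLflow; recent MLflow versions ship a `transformers` flavor (e.g. `mlflow.transformers.log_model`) that logs the model and tokenizer together, which is presumably the kind of rapid implementation the last sentence refers to.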
