The 37th Annual Conference of the Japanese Society for Artificial Intelligence (JSAI 2023)

Presentation Information

Organized Session


[1Q5-OS-29] Transfer Learning: Methods and Applications

Tuesday, June 6, 2023, 17:00-18:40, Room Q (601)

Organizers: 顔 玉蘭, 王 洋, Leo Mao, Yinxi Zhang, Junhua Wu

17:00 - 17:20

[1Q5-OS-29-01] Leveraging BERT for Text Classification: A Hands-on Guide to Transfer Learning

Using Hugging Face to Fine-tune BERT for High-Performance Text Classification

〇Yang Wang1 (1. Databricks)

Keywords: Transfer Learning, NLP, Deep Learning

Transfer learning is a powerful technique that allows a model trained on one task to be fine-tuned on a different but related task. In this presentation, we will explore how to use transfer learning to perform text classification with the BERT model and its variants from Hugging Face. BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained model that has been shown to achieve state-of-the-art results on a wide range of natural language understanding tasks. By fine-tuning BERT on a labeled text classification dataset, we can quickly and easily train a high-performance model with minimal data and computational resources. We will demonstrate how to fine-tune BERT using the Hugging Face library and provide tips and best practices for getting the most out of this powerful technique. Attendees will leave with a solid understanding of how to use transfer learning for text classification and the knowledge to implement their own text classification models using BERT. We will also show how to rapidly implement this with open-source MLflow and Transformers.
