Presentation information

General Session

General Session » GS-2 Machine learning

[4G2-GS-2k] Machine Learning: Fundamental Theory

Fri. Jun 11, 2021 11:00 AM - 12:40 PM Room G (GS room 2)

Chair: Tadahiro Taniguchi (Ritsumeikan University)

11:40 AM - 12:00 PM

[4G2-GS-2k-03] Refined Consistency for Semi-Supervised Learning with Knowledge Distillation

〇Yoshitaka Muramoto1, Naoki Okamoto1, Tubasa Hirakawa1, Takayoshi Yamashita1, Hironobu Fujiyoshi1 (1. Chubu University)

Keywords:Semi-Supervised Learning

Semi-supervised learning is a method that trains a model using both labeled and unlabeled data. Dual Student (DS), which transfers knowledge between two networks, and Multiple Student (MS), which extends DS to four or more networks, have been proposed as semi-supervised learning methods. MS achieves higher accuracy than DS, but its training is inefficient because knowledge is not transferred between all networks at once. In this paper, we propose refined consistency, an efficient knowledge transfer method that transfers knowledge between all networks simultaneously to improve accuracy. In experiments on the CIFAR-100 dataset, we show that the proposed method achieves higher accuracy than MS.
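The all-at-once knowledge transfer described above can be sketched as a consistency loss computed over every pair of student networks in a single pass. This is only an illustrative toy under stated assumptions, not the paper's actual loss: the function names are hypothetical, and the use of a per-example softmax with squared error between student outputs is an assumption.

```python
import math

def softmax(logits):
    """Numerically stable softmax over one example's class logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def refined_consistency_loss(logits_list):
    """Illustrative consistency term: average pairwise squared difference
    between the softmax outputs of all student networks, computed for
    every pair at once rather than one pair at a time.
    `logits_list` holds one logit vector per student for a single
    unlabeled example. (Hypothetical sketch, not the paper's loss.)"""
    probs = [softmax(l) for l in logits_list]
    n = len(probs)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):  # enumerate all student pairs once
            total += sum((a - b) ** 2 for a, b in zip(probs[i], probs[j]))
            pairs += 1
    return total / pairs
```

With four students whose outputs agree, the loss is zero; any disagreement between a pair contributes to a single shared term, so one backward pass propagates knowledge between all networks.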
