JSAI2025

Presentation information

General Session » GS-2 Machine learning

[3S1-GS-2] Machine learning

Thu. May 29, 2025 9:00 AM - 10:40 AM Room S (Room 701-2)

Chair: 北岡 旦 (NEC)

9:00 AM - 9:20 AM

[3S1-GS-2-01] Investigating the Impact of Supernet Pretraining in Multi-Task One-Shot NAS

〇Yotaro Yamaguchi¹, Yuki Tanigaki¹ (1. Osaka Institute of Technology)

Keywords: Neural Architecture Search, Multi-task, Deep learning, Multi-objective optimization

One-Shot Neural Architecture Search (NAS) reduces computational cost by pre-training a supernet and sharing its weights across subnetworks during the search. Adapting it to multi-task learning, however, remains challenging. In this study, we investigate the impact of supernet pre-training on multi-task search performance. Specifically, we examine the effectiveness of using a supernet trained on one task as a warm start for architecture search on a different task. Furthermore, we analyze the effect of updating the supernet weights by continuing to train it on the current search task after the warm start. Our experiments determine whether a pre-trained supernet improves search efficiency and whether additional adaptation during the search enhances architecture performance. The results demonstrate that transferring a supernet across tasks improves search performance, and that further training on the current task significantly enhances search effectiveness.
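The abstract outlines a three-step procedure: pre-train a supernet on a source task, warm-start the architecture search on a target task from those shared weights, and optionally keep updating the weights on the target task during the search. The toy PyTorch sketch below illustrates that general one-shot NAS workflow only; the supernet (TinySupernet), the two-choice search space, the random search, and the synthetic tasks are all hypothetical stand-ins for illustration, not the authors' implementation.

```python
# Minimal sketch of warm-started one-shot NAS (illustrative; not the paper's code).
import copy
import random

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset


class TinySupernet(nn.Module):
    """Toy supernet: each layer holds two candidate ops sharing weights."""

    def __init__(self, dim: int = 16, n_layers: int = 3, n_classes: int = 4):
        super().__init__()
        self.stem = nn.Linear(8, dim)
        # An architecture is a list of 0/1 choices, one per layer,
        # evaluated with the shared supernet weights.
        self.choices = nn.ModuleList(
            nn.ModuleList([nn.Linear(dim, dim),
                           nn.Sequential(nn.Linear(dim, dim), nn.ReLU())])
            for _ in range(n_layers)
        )
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x: torch.Tensor, arch: list[int]) -> torch.Tensor:
        h = F.relu(self.stem(x))
        for layer, op_idx in zip(self.choices, arch):
            h = layer[op_idx](h)
        return self.head(h)


def train_supernet(net, loader, steps, lr=1e-3):
    """One-shot training: sample a random subnetwork at every step."""
    net.train()
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    it = iter(loader)
    for _ in range(steps):
        try:
            x, y = next(it)
        except StopIteration:
            it = iter(loader)
            x, y = next(it)
        arch = [random.randint(0, 1) for _ in net.choices]
        loss = F.cross_entropy(net(x, arch), y)
        opt.zero_grad()
        loss.backward()
        opt.step()


def search(net, loader, n_candidates=20):
    """Random search over subnetworks, scored on one batch for brevity."""
    net.eval()
    x, y = next(iter(loader))
    best_arch, best_acc = None, -1.0
    with torch.no_grad():
        for _ in range(n_candidates):
            arch = [random.randint(0, 1) for _ in net.choices]
            acc = (net(x, arch).argmax(dim=1) == y).float().mean().item()
            if acc > best_acc:
                best_arch, best_acc = arch, acc
    return best_arch, best_acc


def make_loader(seed):
    """Synthetic stand-in for a task's dataset."""
    g = torch.Generator().manual_seed(seed)
    x = torch.randn(256, 8, generator=g)
    y = (x.sum(dim=1) > 0).long() + 2 * (x[:, 0] > 0).long()  # 4 classes
    return DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)


if __name__ == "__main__":
    task_a, task_b = make_loader(0), make_loader(1)

    # 1) Pre-train the supernet on task A.
    net = TinySupernet()
    train_supernet(net, task_a, steps=200)

    # 2) Warm start: reuse the task-A supernet for search on task B ...
    warm = copy.deepcopy(net)
    print("warm start only:", search(warm, task_b))

    # 3) ... and, as the abstract also examines, keep updating the shared
    #    weights on the current task (task B) after the warm start.
    train_supernet(warm, task_b, steps=200)
    print("warm start + task-B updates:", search(warm, task_b))
```

Comparing the two printed scores mirrors the comparison the study makes: search quality from the transferred supernet alone versus after additional training on the current task.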
