3:50 PM - 4:10 PM
[3J4-OS-3b-02] Efficient Search of Multiple Architectures in Structure Complexity Aware Neural Architecture Search
[Online]
Keywords: Neural Architecture Search, Convolutional Neural Network, Importance Sampling
In neural architecture search (NAS), which searches for architectures of deep neural networks, methods have been developed that account not only for prediction performance but also for metrics related to architecture complexity. This study aims to speed up NAS methods that optimize an objective function defined as a weighted sum of two metrics, such as performance and the number of parameters. The proposed method is based on one-shot NAS and optimizes the weight parameters of a super network only once. We then define multiple distributions that generate architectures of different complexities, and update all of them using samples drawn from their mixture via importance sampling. In this way, multiple architectures of different complexities can be obtained in a single search, reducing the search cost. We apply the proposed method to the architecture search of convolutional neural networks and show that multiple architectures of different complexities can be obtained at lower computational cost than with existing methods.
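The core idea of reusing one set of mixture samples to update several complexity-penalized distributions can be illustrated with a toy sketch. This is not the paper's implementation: the search space (a single categorical operation choice), the stand-in performance and cost tables, the multiplicative update rule, and all names (`OPS`, `PERF`, `COST`, `sample_mixture`, `update`) are assumptions made for illustration only.

```python
import random

# Toy sketch: K categorical distributions over one architecture choice,
# each optimizing objective_k(x) = performance(x) - lambda_k * complexity(x).
# PERF and COST are stand-ins for one-shot-NAS evaluations, not real data.

OPS = [0, 1, 2]                        # candidate operations (toy search space)
PERF = {0: 0.70, 1: 0.85, 2: 0.95}     # stand-in performance per operation
COST = {0: 1.0, 1: 2.0, 2: 4.0}        # stand-in complexity per operation


def sample_mixture(dists):
    """Draw one architecture from the uniform mixture of all K distributions."""
    probs = dists[random.randrange(len(dists))]
    r, acc = random.random(), 0.0
    for op, p in enumerate(probs):
        acc += p
        if r < acc:
            return op
    return len(probs) - 1


def mixture_prob(dists, op):
    """Probability of `op` under the uniform mixture of the K distributions."""
    return sum(d[op] for d in dists) / len(dists)


def update(dists, lambdas, lr=0.05, n_samples=200):
    """One step: all K distributions reuse the SAME mixture samples,
    each reweighted by its importance weight p_k(x) / p_mix(x)."""
    samples = [sample_mixture(dists) for _ in range(n_samples)]
    new_dists = []
    for k, d in enumerate(dists):
        grads = [0.0] * len(OPS)
        for x in samples:
            w = d[x] / mixture_prob(dists, x)       # importance weight
            f = PERF[x] - lambdas[k] * COST[x]      # weighted-sum objective
            grads[x] += w * f / n_samples
        # Simple multiplicative update + renormalization (illustrative only).
        upd = [max(p * (1.0 + lr * g), 1e-6) for p, g in zip(d, grads)]
        s = sum(upd)
        new_dists.append([p / s for p in upd])
    return new_dists


random.seed(0)
dists = [[1 / 3] * 3 for _ in range(2)]   # K = 2 distributions
lambdas = [0.0, 0.1]                      # no penalty vs. complexity penalty
for _ in range(300):
    dists = update(dists, lambdas)
```

After the loop, the unpenalized distribution concentrates on the highest-performance operation, while the penalized one shifts mass away from the most expensive operation, so one search run yields architectures of different complexities.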