4:30 PM - 4:50 PM
[2M5-OS-24-04] Neural Architecture Search for Multiple Architectures with Different Model Size Complexities
Keywords: Neural Architecture Search, Constrained Black-Box Optimization, Convolutional Neural Network, Probabilistic Model-based Optimization
To deploy deep neural networks on low-resource devices, several neural architecture search (NAS) methods search for architectures that achieve superior predictive performance under resource constraints such as model size and latency. Moreover, the increasing variety of low-resource devices requires NAS methods to obtain multiple models under different constraints. This study proposes a NAS method that simultaneously searches for multiple architectures satisfying different model size constraints. We update multiple categorical distributions following the stochastic relaxation technique with importance sampling. This update uses a few architectures generated from the mixture distribution, which reduces the search cost. Additionally, we introduce the penalty function method with coefficient adaptation to obtain architectures satisfying the model size constraints, and a sampling strategy to obtain constraint-satisfying architectures from the optimized distribution. The experimental results show that the proposed method achieved architectures with higher predictive performance under the constraints.
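The abstract describes three ingredients: categorical distributions updated via stochastic relaxation, importance-sampled updates using architectures drawn from the mixture distribution, and a penalty function with coefficient adaptation for the size constraints. The toy sketch below illustrates how these pieces could fit together; it is not the authors' implementation, and all concrete choices (the operation set, the synthetic objective, the learning rate, and the 1.05 penalty growth factor) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical search space: 3 layers, each picking one of 3 ops
# with different parameter costs (all numbers are toy values).
n_layers, n_ops = 3, 3
op_sizes = np.array([1.0, 2.0, 4.0])   # model-size cost per op
constraints = [4.0, 8.0]               # two model-size budgets
K = len(constraints)

# One categorical distribution per constraint: theta[k][layer, op].
theta = np.full((K, n_layers, n_ops), 1.0 / n_ops)
penalty_coef = np.ones(K)              # adapted penalty coefficients

def accuracy(arch):
    # Placeholder objective: pretend larger ops predict better.
    return op_sizes[arch].sum() / (n_layers * op_sizes.max())

def model_size(arch):
    return op_sizes[arch].sum()

lr, n_samples = 0.1, 16
for step in range(200):
    # Draw a shared batch of architectures from the mixture
    # distribution so all K distributions reuse the same samples.
    mixture = theta.mean(axis=0)
    archs = np.array([
        [rng.choice(n_ops, p=mixture[l]) for l in range(n_layers)]
        for _ in range(n_samples)
    ])
    for k in range(K):
        scores, ws = [], []
        for a in archs:
            # Importance weight: p_k(arch) / p_mixture(arch).
            p_k = np.prod(theta[k][np.arange(n_layers), a])
            p_mix = np.prod(mixture[np.arange(n_layers), a])
            ws.append(p_k / p_mix)
            # Penalized objective: accuracy minus size violation.
            violation = max(0.0, model_size(a) - constraints[k])
            scores.append(accuracy(a) - penalty_coef[k] * violation)
        scores, ws = np.array(scores), np.array(ws)
        util = scores - scores.mean()  # baseline-subtracted scores
        # Natural-gradient-style update of the categorical parameters.
        grad = np.zeros_like(theta[k])
        for a, w, u in zip(archs, ws, util):
            onehot = np.zeros_like(theta[k])
            onehot[np.arange(n_layers), a] = 1.0
            grad += w * u * (onehot - theta[k])
        theta[k] += lr * grad / n_samples
        theta[k] = np.clip(theta[k], 1e-3, None)
        theta[k] /= theta[k].sum(axis=1, keepdims=True)
        # Coefficient adaptation: grow the penalty while the expected
        # size under theta[k] still violates the budget.
        if (theta[k] * op_sizes).sum() > constraints[k]:
            penalty_coef[k] *= 1.05
```

After optimization, one distribution per budget remains, and a sampling strategy (e.g., drawing candidates from each distribution and keeping those within its budget) can extract constraint-satisfying architectures.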