[3Xin4-64] Discovery of different subnetworks according to input data using HyperNetworks based on the Strong Lottery Ticket Hypothesis
Keywords: Deep Learning, Neural Network, Strong Lottery Ticket Hypothesis
The Strong Lottery Ticket Hypothesis (SLTH) states that "a randomly weighted neural network contains a subnetwork that performs well on a given task". This hypothesis has now been proven to hold under certain conditions. The edge-popup algorithm is a method for training a model based on this hypothesis: it can find well-performing subnetworks without updating the weights themselves. However, its performance drops rapidly when the size of the subnetwork is reduced, because fewer parameters are used, so the model capacity shrinks and performance degrades. To address this problem, we experiment with using HyperNetworks to select a different subnetwork for each input, thereby minimizing the performance degradation. As a result, our proposed method showed better training performance than the edge-popup algorithm when 50% of the parameters were used; however, overfitting occurred and generalization performance decreased.
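To illustrate the subnetwork selection the abstract refers to, the following is a minimal sketch of the weight-selection step used in edge-popup-style methods: each frozen random weight has a score, and only the top-k% of weights by score are kept. All names, the fixed keep ratio, and the plain-Python setup are illustrative assumptions; the actual edge-popup algorithm additionally trains the scores by gradient descent (via a straight-through estimator), which is not shown here.

```python
import random

def edge_popup_mask(scores, keep_ratio):
    """Binary mask (1.0 = keep) selecting the top `keep_ratio`
    fraction of weights by score, as in edge-popup-style selection.
    Assumes distinct score values (ties at the threshold would keep extras)."""
    k = max(1, round(len(scores) * keep_ratio))
    threshold = sorted(scores, reverse=True)[k - 1]  # k-th largest score
    return [1.0 if s >= threshold else 0.0 for s in scores]

random.seed(0)
weights = [random.gauss(0, 1) for _ in range(16)]  # frozen random weights
scores = [random.gauss(0, 1) for _ in range(16)]   # would be trained in edge-popup
mask = edge_popup_mask(scores, keep_ratio=0.5)     # 50% sparsity, as in the experiment
subnetwork = [w * m for w, m in zip(weights, mask)]  # weights used in the forward pass
```

In the HyperNetworks variant described above, the scores (and hence the mask) would be produced per input by a separate network rather than being a single trained set.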