[3Rin4-24] Construction of Residual Skip Connection by ReLU Perceptron and Mathematical Analysis Based on Representation Set.
Keywords: Deep Learning, Representation Power, Residual Skip Connection, Mathematical Analysis, Model Design
The purpose of this study is to provide a systematic theory and mathematical analysis for the design of DNNs with skip-connections.
In the past, DNN performance evaluations were often based on experimental results that depended on the data and the task, and it was unclear how differences in model structure, such as skip connections, would affect performance.
To address this problem, we analyze fundamental and interpretable properties of model structures by means of a representation set.
As a result, we show that the basic residual form of the skip connection can be understood as a parameter restriction of a simple, wider DNN with ReLU activation.
We also show that this restriction corresponds to the recently proposed parametric ReLU (PReLU) activation.
These results contribute to the systematization of DNN model design.
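The abstract does not spell out the construction, but the following is a minimal NumPy sketch of one standard way the stated equivalence can hold: the identity x = ReLU(x) - ReLU(-x) absorbs the identity skip of a basic residual block into a wider plain ReLU layer whose extra weights are tied to plus/minus the identity matrix (the "parameter restriction"). The names W1, W2, V1, V2 and the dimensions are illustrative assumptions, not the paper's notation.

import numpy as np

rng = np.random.default_rng(0)
d, h = 4, 8                        # input dimension, width of the residual branch
W1 = rng.standard_normal((h, d))   # hypothetical residual-branch weights
W2 = rng.standard_normal((d, h))

def relu(z):
    return np.maximum(z, 0.0)

def residual_block(x):
    # Basic residual form: identity skip plus a one-hidden-layer ReLU branch.
    return x + W2 @ relu(W1 @ x)

# Restricted wider plain ReLU layer: the identity path is carried by 2d extra
# hidden units via x = ReLU(x) - ReLU(-x), with their in/out weights tied to
# +I and -I. This tying is the parameter restriction on the wider network.
V1 = np.vstack([W1, np.eye(d), -np.eye(d)])    # shape (h + 2d, d)
V2 = np.hstack([W2, np.eye(d), -np.eye(d)])    # shape (d, h + 2d)

def wider_relu_layer(x):
    return V2 @ relu(V1 @ x)

x = rng.standard_normal(d)
assert np.allclose(residual_block(x), wider_relu_layer(x))
print("residual block == restricted wider ReLU layer on a random input")

One way to read the stated PReLU correspondence, in the scalar case: x + w * relu(x) equals (1 + w) * x for x > 0 and x for x < 0, i.e. a rescaled parametric ReLU (1 + w) * PReLU_a(x) with negative-part slope a = 1 / (1 + w), assuming 1 + w > 0. Whether this is the exact correspondence the paper proves cannot be confirmed from the abstract alone.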