4:30 PM - 4:50 PM
[2L5-OS-19a-04] Robustness comparison of different learning rates of optimization algorithms in CNN models
Keywords: CNN, Optimization Algorithms
SGD and Adam are optimization algorithms commonly used for training DNN models. While Adam is favored over SGD in many applications, its robustness has not been thoroughly studied. In particular, different learning rates may yield different robustness even when generalization performance is almost identical. In this paper, we investigate the robustness of each optimization algorithm using indicators based on the active neurons within the model. We train models using SGD and Adam with four learning rates each, apply noise to the test inputs, and compare the models using three metrics. The results from our proposed method show that SGD exhibits lower robustness than Adam. Additionally, models with lower active-neuron rates exhibit lower robustness. These findings have the potential to establish benchmarks for robustness performance and to aid the development of future optimization algorithms.
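A minimal sketch of the kind of comparison the abstract describes, assuming a PyTorch setup. The CNN architecture, the Gaussian noise model, the four learning rates, and the definition of the "active-neuron rate" below are illustrative assumptions, not the authors' exact protocol or metrics.

```python
# Hypothetical sketch: train a small CNN with SGD and Adam at several learning
# rates, then compare an assumed active-neuron rate and an assumed robustness
# measure (accuracy drop under Gaussian input noise).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.fc = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        h1 = F.relu(self.conv1(x))
        h2 = F.relu(self.conv2(F.max_pool2d(h1, 2)))
        logits = self.fc(F.max_pool2d(h2, 2).flatten(1))
        return logits, (h1, h2)  # hidden activations for the neuron-based indicator

def active_neuron_rate(activations):
    # Assumed definition: fraction of ReLU outputs that are strictly positive.
    total = sum(a.numel() for a in activations)
    active = sum((a > 0).sum().item() for a in activations)
    return active / total

def robustness_drop(model, x_test, y_test, noise_std=0.1):
    # Accuracy drop when Gaussian noise is added to the test inputs.
    model.eval()
    with torch.no_grad():
        clean = (model(x_test)[0].argmax(1) == y_test).float().mean().item()
        noisy_x = x_test + noise_std * torch.randn_like(x_test)
        noisy = (model(noisy_x)[0].argmax(1) == y_test).float().mean().item()
    return clean - noisy

def train(optimizer_name, lr, x_train, y_train, epochs=3):
    model = SmallCNN()
    opt = (torch.optim.SGD(model.parameters(), lr=lr) if optimizer_name == "sgd"
           else torch.optim.Adam(model.parameters(), lr=lr))
    for _ in range(epochs):
        opt.zero_grad()
        logits, _ = model(x_train)
        F.cross_entropy(logits, y_train).backward()
        opt.step()
    return model

if __name__ == "__main__":
    # Synthetic stand-in data (28x28 grayscale, 10 classes) just to keep the sketch runnable.
    x_train, y_train = torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,))
    x_test, y_test = torch.randn(128, 1, 28, 28), torch.randint(0, 10, (128,))
    for name in ("sgd", "adam"):
        for lr in (1e-1, 1e-2, 1e-3, 1e-4):  # four learning rates per optimizer, as in the abstract
            model = train(name, lr, x_train, y_train)
            with torch.no_grad():
                _, acts = model(x_test)
            print(f"{name} lr={lr:g}  active-neuron rate={active_neuron_rate(acts):.3f}  "
                  f"accuracy drop under noise={robustness_drop(model, x_test, y_test):.3f}")
```

The actual study uses three comparison metrics; only one noise-based robustness measure is sketched here for illustration.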