JSAI2024

Presentation information

Organized Session


[2L5-OS-19a] OS-19

Wed. May 29, 2024 3:30 PM - 5:10 PM Room L (Room 52)

Organizers: Yoshinao Isobe (National Institute of Advanced Industrial Science and Technology), Shin Nakajima (The Open University of Japan / National Institute of Informatics), Kenichi Kobayashi (Fujitsu Limited)

4:30 PM - 4:50 PM

[2L5-OS-19a-04] Robustness comparison of different learning rates of optimization algorithms in CNN models

Yuto Yokoyama1, 〇Kozo Okano1, Shinpei Ogata1, Shin Nakajima2 (1. Shinshu University, 2. National Institute of Informatics)

Keywords: CNN, Optimization Algorithms

SGD and Adam are optimization algorithms commonly used for training DNN models. Although Adam is favored over SGD in many applications, their robustness has not been thoroughly studied. In particular, different learning rates may yield different robustness even when generalization performance is almost the same. In this paper, we investigate the robustness of each optimization algorithm using indicators based on the active neurons within the model. We train models with SGD and Adam at four learning rates, apply noise to the test inputs, and compare the models using three metrics. The results of our proposed method show that SGD exhibits lower robustness than Adam. In addition, models with lower active neuron rates exhibit lower robustness. These findings could serve as a benchmark for robustness and aid the development of future optimization algorithms.
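
The following is a minimal sketch of the kind of experiment the abstract describes, not the authors' actual setup: the CNN architecture, the Gaussian noise level, the specific learning rates, and the definition of the "active neuron rate" (fraction of post-ReLU activations greater than zero) are all illustrative assumptions.

```python
# Sketch: train a small CNN with SGD and Adam at several learning rates,
# perturb the test inputs with noise, and report noisy accuracy together
# with an assumed "active neuron rate" metric. All details are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, 3, padding=1)
        self.conv2 = nn.Conv2d(8, 16, 3, padding=1)
        self.fc = nn.Linear(16 * 7 * 7, 10)

    def forward(self, x, record=None):
        h1 = F.relu(self.conv1(x))
        h2 = F.relu(self.conv2(F.max_pool2d(h1, 2)))
        if record is not None:
            # Fraction of post-ReLU units that fired (assumed active-neuron rate).
            record.extend([(h1 > 0).float().mean().item(),
                           (h2 > 0).float().mean().item()])
        return self.fc(F.max_pool2d(h2, 2).flatten(1))

def train(optimizer_name, lr, x, y, epochs=5):
    model = SmallCNN()
    opt = (torch.optim.SGD(model.parameters(), lr=lr) if optimizer_name == "SGD"
           else torch.optim.Adam(model.parameters(), lr=lr))
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt.step()
    return model

def evaluate(model, x, y, noise_std):
    # Apply Gaussian noise to the test inputs, then measure accuracy and
    # the mean active neuron rate on the perturbed inputs.
    x_noisy = x + noise_std * torch.randn_like(x)
    record = []
    with torch.no_grad():
        acc = (model(x_noisy, record).argmax(1) == y).float().mean().item()
    return acc, sum(record) / len(record)

# Toy random data standing in for a real 28x28 grayscale image dataset.
x_train, y_train = torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,))
x_test, y_test = torch.randn(128, 1, 28, 28), torch.randint(0, 10, (128,))

for name in ("SGD", "Adam"):
    for lr in (1e-1, 1e-2, 1e-3, 1e-4):  # four learning rates, as in the abstract
        model = train(name, lr, x_train, y_train)
        acc, active_rate = evaluate(model, x_test, y_test, noise_std=0.3)
        print(f"{name:4s} lr={lr:.0e}  noisy-acc={acc:.3f}  active-rate={active_rate:.3f}")
```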
