JSAI2022

Presentation information


General Session » GS-2 Machine learning

[1D1-GS-2] Machine learning: algorithm

Tue. Jun 14, 2022 10:00 AM - 11:20 AM, Room D

Chair: Ryo Karakida (National Institute of Advanced Industrial Science and Technology (AIST)) [On-site]

10:20 AM - 10:40 AM

[1D1-GS-2-02] Scheduling of Damping in Natural Gradient Method

〇Hiroki Naganuma (1, 2), Gaku Fujimori (3), Mari Takeuchi (4), Jumpei Nagase (5) (1. University of Montreal, 2. Mila, 3. Tokyo University of Science, 4. University College London, 5. Shibaura Institute of Technology)

[Online]

Keywords:Deep Learning, Second-Order Optimization, Damping

In recent years, second-order optimization, which offers a fast convergence rate, has been used in deep learning owing to fast approximation methods for the natural gradient method. Second-order optimization requires inverting the information matrix, which is generally degenerate in deep learning problems. As a heuristic, damping therefore adds the identity matrix multiplied by a constant before inversion. This study proposes a method for scheduling the damping, motivated by the Levenberg-Marquardt method for determining the damping value, and investigates its effectiveness.
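As a rough illustration of the general idea described above (not the authors' exact algorithm), the sketch below shows a damped natural-gradient update together with a Levenberg-Marquardt-style damping adjustment: damping is decreased when a step reduces the loss and increased when it does not. All function names, constants, and thresholds here are hypothetical.

```python
import numpy as np

def damped_natural_gradient_step(theta, grad, fisher, damping, lr=1.0):
    """One natural-gradient step with Tikhonov-style damping:
    solve (F + damping * I) d = grad instead of inverting F directly."""
    d = np.linalg.solve(fisher + damping * np.eye(fisher.shape[0]), grad)
    return theta - lr * d

def update_damping_lm(damping, prev_loss, new_loss,
                      boost=1.5, drop=0.9,
                      min_damping=1e-8, max_damping=1e2):
    """Levenberg-Marquardt-style heuristic: lower damping when the step
    reduced the loss (trust the curvature model more), raise it otherwise.
    Constants are illustrative only; the paper's schedule may differ."""
    if new_loss < prev_loss:
        return max(damping * drop, min_damping)
    return min(damping * boost, max_damping)
```

In practice, the Fisher/information matrix would be replaced by a tractable approximation (e.g., a Kronecker-factored one) and the linear solve would exploit that structure; the sketch only shows the interplay between the damping term and its schedule.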
