JSAI2022

Presentation information


[3Yin2] Interactive session 1

Thu. Jun 16, 2022 11:30 AM - 1:10 PM Room Y (Event Hall)

[3Yin2-15] Analysis of feature distribution using KLD between inputs and intermediate layer outputs in revetment image

〇Ryuto Yoshida1, Yukino Tsuzuki1, Junichi Okubo1, Junichirou Fujii1, Takayoshi Yamashita2 (1.Yachiyo Engineering Co., Ltd., 2.Chubu University)

Keywords: KL divergence, Batch Normalization, standardization

Scaling input data, for example by standardization or normalization, is a common technique in machine learning; it reduces differences in the distribution of the input data. Batch Normalization serves the same purpose for the distribution of features after convolution. Generally, the output of a Batch Normalization layer is transformed non-linearly by the activation function, so the distribution of features passing through the Batch Normalization layer affects how those features propagate. The task of this study is crack segmentation of revetments. The output of the Batch Normalization layer is evaluated using KL divergence, a measure of the difference between probability distributions, and the correlation between the feature distribution of the input image and the KL divergence is analyzed.
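The abstract does not specify how the KL divergence between input and intermediate-layer distributions is estimated; a common approach is to histogram both sets of activations over shared bins and compute the discrete KL divergence. The sketch below illustrates this with synthetic stand-in data (the function name `kl_divergence`, the bin count, and the Gaussian stand-ins for standardized input pixels and Batch Normalization outputs are all assumptions for illustration, not the authors' procedure):

```python
import numpy as np

def kl_divergence(p_samples, q_samples, bins=50, eps=1e-10):
    """Empirical KL divergence D(P || Q) estimated from samples
    via histograms over a shared bin range."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    # Normalize counts to probabilities; eps avoids log(0) in empty bins.
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
inputs = rng.normal(0.0, 1.0, 10_000)   # stand-in: standardized input pixels
bn_out = rng.normal(0.2, 1.3, 10_000)   # stand-in: Batch Normalization outputs
print(kl_divergence(inputs, bn_out))
```

In practice the second sample set would be the flattened activations captured at the Batch Normalization layer (e.g. via a forward hook), rather than synthetic draws.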
