16:40 〜 17:00
[2H4-E-2-05] Reducing the Number of Multiplications in Convolutional Recurrent Neural Networks (ConvRNNs)
Keywords: convolutional RNN, computational complexity reduction
Convolutional variants of recurrent neural networks (ConvRNNs) are widely used for spatio-temporal modeling. Although ConvRNNs are well suited to modeling two-dimensional sequences, the convolution operations introduce additional parameters and increase the computational complexity. This computational load can be an obstacle to deploying ConvRNNs in real-world applications. We propose to reduce the number of parameters and multiplications by substituting the Hadamard product for some of the convolution operations. We evaluate our proposal on the task of next video frame prediction using the Moving MNIST dataset. The proposed method requires 38% fewer multiplications and 21% fewer parameters than its fully convolutional counterpart. At the price of the reduced computational complexity, performance measured by the structural similarity index measure (SSIM) decreased by about 1.5%. ConvRNNs with reduced computation can be used in a wider range of situations, such as web apps or embedded systems.
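As a rough illustration of the general idea only (the abstract does not specify the exact cell architecture), the following PyTorch sketch shows a ConvLSTM-style cell in which the hidden-to-hidden transforms of the three sigmoid gates are replaced by per-channel Hadamard weights, while the input-to-hidden transforms and the cell-candidate recurrence remain convolutional. The class name, the choice of which gates to simplify, and the per-channel weight shape are all assumptions made for this sketch.

```python
import torch
import torch.nn as nn

class HadamardConvLSTMCell(nn.Module):
    """Illustrative ConvLSTM-style cell with some recurrent convolutions
    replaced by element-wise (Hadamard) weights to save multiplications."""

    def __init__(self, in_ch, hid_ch, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.hid_ch = hid_ch
        # Input-to-hidden transforms stay convolutional (all four gates at once).
        self.x_conv = nn.Conv2d(in_ch, 4 * hid_ch, kernel_size, padding=pad)
        # Hidden-to-hidden convolution kept only for the cell-candidate gate.
        self.h_conv = nn.Conv2d(hid_ch, hid_ch, kernel_size, padding=pad, bias=False)
        # Per-channel Hadamard weights for the input/forget/output gates:
        # this removes the k*k*hid_ch multiply-accumulates per pixel that a
        # recurrent convolution would need for each of these gates.
        self.h_hadamard = nn.Parameter(0.01 * torch.randn(3 * hid_ch, 1, 1))

    def forward(self, x, state):
        h, c = state
        xi, xf, xo, xg = torch.chunk(self.x_conv(x), 4, dim=1)
        # Hadamard recurrent terms for the three sigmoid gates.
        hi, hf, ho = torch.chunk(self.h_hadamard * h.repeat(1, 3, 1, 1), 3, dim=1)
        i = torch.sigmoid(xi + hi)
        f = torch.sigmoid(xf + hf)
        o = torch.sigmoid(xo + ho)
        g = torch.tanh(xg + self.h_conv(h))   # convolutional recurrence kept here
        c_new = f * c + i * g
        h_new = o * torch.tanh(c_new)
        return h_new, c_new

# Minimal usage: one step on a batch of 8x64x64 frames with 32 hidden channels.
cell = HadamardConvLSTMCell(in_ch=1, hid_ch=32)
x = torch.randn(8, 1, 64, 64)
h = c = torch.zeros(8, 32, 64, 64)
h, c = cell(x, (h, c))
```

The savings come entirely from the recurrent path: each gate whose hidden-to-hidden convolution is replaced by a Hadamard product drops both the kernel parameters and the per-pixel multiply-accumulates of that convolution, which is consistent with the kind of reduction reported in the abstract.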