14:00 〜 14:20
[4A2-01] Gated Recurrent Neural Network with Tensor Product
Keywords: recurrent neural network, deep learning
In the machine learning field, Recurrent Neural Networks (RNNs) have become a primary choice for modeling sequential data such as text and speech. To deal with long-term dependencies in long sequences, RNNs utilize gating mechanisms to improve the gradient flow across multiple time-steps and to avoid the exploding/vanishing gradient problem. On the other hand, we would like to improve the representational power of RNNs by using a more expressive operation than standard matrix multiplication and summation. In this paper, we propose a new RNN architecture with a gating mechanism and a tensor product between the input layer, the previous hidden layer, and a 3rd-order tensor weight, which we call the gated recurrent neural tensor network (GRURNTN).
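To make the tensor-product idea concrete, below is a minimal NumPy sketch of a GRU-like time-step whose candidate activation adds a bilinear term formed from the input, the reset-gated previous hidden state, and a 3rd-order tensor weight. All names, shapes, and the exact placement of the bilinear term are illustrative assumptions, not the authors' reference implementation.

```python
# Hypothetical sketch of a gated recurrent step with a tensor-product term,
# assuming a standard GRU layout; not the paper's reference code.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def grurntn_step(x, h_prev, params):
    """One assumed GRURNTN-style step: GRU gates plus a bilinear
    (3rd-order tensor) term in the candidate hidden state."""
    # W_* : (hidden, input), U_* : (hidden, hidden), T : (hidden, hidden, input)
    W_z, U_z, W_r, U_r, W_h, U_h, T = params
    z = sigmoid(W_z @ x + U_z @ h_prev)        # update gate
    r = sigmoid(W_r @ x + U_r @ h_prev)        # reset gate
    # Bilinear term: for each output unit k, (r * h_prev)^T T[k] x
    bilinear = np.einsum('i,kij,j->k', r * h_prev, T, x)
    h_tilde = np.tanh(W_h @ x + U_h @ (r * h_prev) + bilinear)
    return (1.0 - z) * h_prev + z * h_tilde    # interpolate old and new state

# Tiny usage example with random parameters (hidden size 4, input size 3).
rng = np.random.default_rng(0)
H, D = 4, 3
params = (rng.standard_normal((H, D)), rng.standard_normal((H, H)),
          rng.standard_normal((H, D)), rng.standard_normal((H, H)),
          rng.standard_normal((H, D)), rng.standard_normal((H, H)),
          0.01 * rng.standard_normal((H, H, D)))
h = grurntn_step(rng.standard_normal(D), np.zeros(H), params)
```

The bilinear einsum is the part that goes beyond standard matrix multiplication and summation: each output unit gets its own hidden-by-input interaction matrix, which is one plausible reading of the "more expressive operation" described in the abstract.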