Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks
Abstract
This paper evaluates three variants of the Gated Recurrent Unit (GRU) in recurrent neural networks (RNNs), obtained by reducing the number of parameters in the update and reset gates. The three variant GRU models are evaluated on the MNIST and IMDB datasets and are shown to perform as well as the original GRU-RNN model while reducing computational expense.
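The abstract does not spell out the gate reductions themselves. As a hedged sketch, the block below gives the standard GRU cell equations together with one reading of the three parameter-reduced gate forms (labeled GRU1–GRU3 in the paper); the exact variant definitions should be checked against the full text.

```latex
\begin{align}
  % Standard GRU cell for input x_t and previous hidden state h_{t-1}:
  z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) && \text{(update gate)} \\
  r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) && \text{(reset gate)} \\
  \tilde{h}_t &= \tanh\!\bigl(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\bigr)
      && \text{(candidate state)} \\
  h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t
      && \text{(state update)} \\
  % Variant gates (the same reduction is applied to r_t);
  % these forms are an assumption, not quoted from the abstract:
  \text{GRU1: } z_t &= \sigma(U_z h_{t-1} + b_z)
      && \text{drop the input weights } W_z \\
  \text{GRU2: } z_t &= \sigma(U_z h_{t-1})
      && \text{drop input weights and bias} \\
  \text{GRU3: } z_t &= \sigma(b_z)
      && \text{keep only the bias}
\end{align}
```

Under this reading, each variant removes progressively more gate parameters, with GRU3 reducing each gate to a single learned bias vector; the candidate-state and state-update equations are left unchanged.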
- Publication: arXiv e-prints
- Pub Date: January 2017
- DOI: 10.48550/arXiv.1701.05923
- arXiv: arXiv:1701.05923
- Bibcode: 2017arXiv170105923D
- Keywords: Computer Science - Neural and Evolutionary Computing; Statistics - Machine Learning
- E-Print: 5 pages, 8 figures, 4 tables