GRU on top of ELMo embedding layer #12
Comments
I think it is because the ElmoEmbeddingLayer uses the module's mean-pooled 'default' output, which is a single 1024-dimensional vector per sentence (2-D), whereas the GRU expects a 3-D sequence of token vectors.
Thanks @hambro – I had sketched out a working prototype a while back – I'll see if I can dig something up and get it up here when I get a bit of free time.
I modified the ELMo embedding layer as follows, and the output_shape is now (batch_size, max_len, 1024).
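The commenter's code block did not survive the copy, so the following is only a minimal sketch of such a modification, assuming TF 1.x, Keras 2.x and the TF-Hub ELMo module. The constructor arguments batch_size and max_len are placeholders I added, not necessarily the commenter's; the key change is requesting the per-token 'elmo' output instead of the mean-pooled 'default' output.

```python
# Sketch only: a Keras custom layer that returns per-token ELMo vectors
# (3-D) so that a recurrent layer such as GRU can consume its output.
import tensorflow as tf
import tensorflow_hub as hub
from keras import backend as K
from keras.engine import Layer

class ElmoEmbeddingLayer(Layer):
    def __init__(self, batch_size, max_len, **kwargs):
        self.dimensions = 1024        # size of each ELMo token vector
        self.batch_size = batch_size  # has to be fixed up front (see note below)
        self.max_len = max_len
        self.trainable = True
        super(ElmoEmbeddingLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        self.elmo = hub.Module('https://tfhub.dev/google/elmo/2',
                               trainable=self.trainable,
                               name='{}_module'.format(self.name))
        self._trainable_weights += tf.trainable_variables(
            scope='^{}_module/.*'.format(self.name))
        super(ElmoEmbeddingLayer, self).build(input_shape)

    def call(self, x, mask=None):
        # 'elmo' gives one 1024-d vector per token: (batch, time, 1024);
        # 'default' would give a single mean-pooled vector: (batch, 1024).
        return self.elmo(K.squeeze(K.cast(x, tf.string), axis=1),
                         as_dict=True,
                         signature='default')['elmo']

    def compute_output_shape(self, input_shape):
        # 3-D output, which is why the batch size must be known here.
        return (self.batch_size, self.max_len, self.dimensions)
```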
The only problem is that you have to give the batch size explicitly. This means the last batch might be smaller than batch_size, which raises an error when the model reaches the end of an epoch.
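One possible workaround for that last-batch problem (my own sketch, not from the thread): trim the training arrays to a whole number of batches, so the model never sees a partial final batch. `x_train`, `y_train`, `model` and `batch_size` are placeholder names.

```python
# Drop the trailing partial batch so every batch has exactly batch_size samples.
batch_size = 32                                     # assumed value
usable = (len(x_train) // batch_size) * batch_size
x_train, y_train = x_train[:usable], y_train[:usable]

model.fit(x_train, y_train, batch_size=batch_size, epochs=5)
```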
Hi,
I replaced my embedding layer with the ELMo embedding layer. The code looks like this -
```
embedding_layer = ElmoEmbeddingLayer()
```
But I am running into an error: `Input 0 is incompatible with layer gru: expected ndim=3, found ndim=2`. The architecture worked well with the default embedding layer. Any idea what I am doing wrong?
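For completeness, a hedged sketch of the stacked model, assuming the token-level ElmoEmbeddingLayer sketched above: once the embedding layer emits a 3-D tensor, the GRU's ndim=3 requirement is met. The sizes (batch_size, max_len, GRU units, number of classes) are placeholders, not values from the issue.

```python
from keras.layers import Input, GRU, Dense
from keras.models import Model

batch_size, max_len, n_classes = 32, 50, 2          # assumed values

# One raw sentence string per sample; the batch size is fixed because the
# ELMo layer's compute_output_shape needs it.
sequence_input = Input(batch_shape=(batch_size, 1), dtype='string')
embedded_sequences = ElmoEmbeddingLayer(batch_size, max_len)(sequence_input)  # (batch, time, 1024)
x = GRU(128)(embedded_sequences)                    # 3-D input, as the GRU expects
preds = Dense(n_classes, activation='softmax')(x)

model = Model(sequence_input, preds)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['acc'])
```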