3.1.3: I was slightly confused by the `max_len` argument; I wasn't sure whether it was somehow related to the batch size or to the sequence length itself. It might help to clarify this and to add a brief comment about the padding/slicing that happens when `len != max_len`.
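If I understand correctly, `max_len` fixes the sequence length of each example, not the batch size. A minimal sketch of the usual pad-or-slice behavior (the function name and `pad_id` here are my own, not from the book; the actual implementation may differ):

```python
def pad_or_truncate(tokens, max_len, pad_id=0):
    """Force a list of token IDs to exactly max_len items."""
    if len(tokens) > max_len:
        return tokens[:max_len]  # slice off the excess
    # otherwise pad the shortfall with pad_id
    return tokens + [pad_id] * (max_len - len(tokens))

pad_or_truncate([5, 8, 2], max_len=5)           # -> [5, 8, 2, 0, 0]
pad_or_truncate([5, 8, 2, 9, 1, 7], max_len=5)  # -> [5, 8, 2, 9, 1]
```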
Will update this as I go along...
Question on embedding size: once this has been set, does it stay the same regardless of the size of the input? I am thinking of NLP, where one sentence might be 6 words and the next 13. If `embed_size = 32`, for example, would it stay fixed for inputs of any length at inference time? Does it need to be larger than the largest possible input?
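If it helps, here is a quick PyTorch sketch (the vocab size of 100 is an arbitrary choice of mine) showing that `embed_size` sets the per-token vector dimension, which stays fixed while only the sequence axis of the output changes with sentence length:

```python
import torch
import torch.nn as nn

# Each of the 100 vocabulary entries maps to a fixed 32-dim vector.
embed = nn.Embedding(num_embeddings=100, embedding_dim=32)  # embed_size = 32

short = torch.tensor([[1, 2, 3, 4, 5, 6]])  # a 6-word sentence
longer = torch.arange(13).unsqueeze(0)      # a 13-word sentence

print(embed(short).shape)   # torch.Size([1, 6, 32])
print(embed(longer).shape)  # torch.Size([1, 13, 32])
```

So the embedding size never needs to grow with the input; it is independent of sequence length.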
Will update this as I go along...