PyTorch attention cells, transformer, BERT, and conversion script #1532
Conversation
The documentation website for preview: http://gluon-nlp-staging.s3-accelerate.dualstack.amazonaws.com/PR1532/c52e64eedb4eb6623a4455f96ad62c6c0bdd783e/index.html
Codecov Report

@@            Coverage Diff             @@
##           master    #1532      +/-   ##
==========================================
- Coverage   86.37%   85.79%    -0.59%
==========================================
  Files          55       55
  Lines        7522     7426      -96
==========================================
- Hits         6497     6371     -126
- Misses       1025     1055      +30

Continue to review full report at Codecov.
I'll approve. We can improve the testing later, since this is the initial version of the PyTorch-based GluonNLP.
Adds PyTorch attention cells, transformer, BERT, and a conversion script. Correctness is verified via the included tests and the conversion script.
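To illustrate what an "attention cell" along these lines looks like, here is a minimal multi-head attention sketch in PyTorch. This is a hypothetical illustration, not the actual implementation added in this PR; the class name, `units`/`num_heads` parameters, and projection layout are assumptions for the example.

```python
import math
import torch
import torch.nn as nn


class MultiHeadAttentionCell(nn.Module):
    """Minimal multi-head scaled dot-product attention (illustrative sketch)."""

    def __init__(self, units: int, num_heads: int):
        super().__init__()
        assert units % num_heads == 0, "units must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = units // num_heads
        # Separate Q/K/V projections plus an output projection.
        self.proj_q = nn.Linear(units, units)
        self.proj_k = nn.Linear(units, units)
        self.proj_v = nn.Linear(units, units)
        self.proj_out = nn.Linear(units, units)

    def forward(self, query, key, value):
        B, Tq, _ = query.shape
        Tk = key.shape[1]
        # Project, then split into heads: (B, T, units) -> (B, heads, T, head_dim)
        q = self.proj_q(query).view(B, Tq, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.proj_k(key).view(B, Tk, self.num_heads, self.head_dim).transpose(1, 2)
        v = self.proj_v(value).view(B, Tk, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention scores: (B, heads, Tq, Tk)
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        attn = torch.softmax(scores, dim=-1)
        # Weighted sum of values, then merge heads back: (B, Tq, units)
        out = (attn @ v).transpose(1, 2).reshape(B, Tq, -1)
        return self.proj_out(out)


if __name__ == "__main__":
    cell = MultiHeadAttentionCell(units=64, num_heads=4)
    x = torch.randn(2, 5, 64)
    print(cell(x, x, x).shape)  # torch.Size([2, 5, 64])
```

In a conversion setting like the one this PR describes, correctness of such a cell is typically checked by loading the same weights into both frameworks and asserting that outputs agree within a small tolerance.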