Usage examples (pypi and readme) are outdated [BPETokenizer rename] #152
Comments
I got the same issue in the latest version. But after installing version
```
(python-37) user@user:~$ ipython
Python 3.7.6 | packaged by conda-forge | (default, Jan 7 2020, 22:33:48)

In [1]: from tokenizers import BPETokenizer
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-812973393cae> in <module>
----> 1 from tokenizers import BPETokenizer

ImportError: cannot import name 'BPETokenizer' from 'tokenizers' (/home/user/anaconda3/envs/python-37/lib/python3.7/site-packages/tokenizers/__init__.py)
```
Facing the same issue.
Ok, I just looked at the changelogs; v0.3.0 did:
So importing BPETokenizer obviously doesn't work. I'll change the name of the issue, then.
tsteffek changed the title from "Pip package V0.3.0 and up not working" to "Usage examples (pypi and readme) are outdated" on Feb 16, 2020
tsteffek changed the title from "Usage examples (pypi and readme) are outdated" to "Usage examples (pypi and readme) are outdated [BPETokenizer rename]" on Feb 16, 2020
Edit: BPETokenizer has been renamed to CharBPETokenizer; however, all the examples still feature BPETokenizer.
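Until the examples are updated, a version-agnostic import can paper over the rename. This is only a sketch based on the rename described above (BPETokenizer became CharBPETokenizer in v0.3.0); the helper function name is mine, not part of the tokenizers API:

```python
def get_bpe_tokenizer_class(tokenizers_module):
    """Return the character-level BPE tokenizer class from the given
    tokenizers module, accepting either the pre- or post-0.3.0 name."""
    # Prefer the post-rename name, then fall back to the old one.
    for name in ("CharBPETokenizer", "BPETokenizer"):
        cls = getattr(tokenizers_module, name, None)
        if cls is not None:
            return cls
    raise ImportError(
        "tokenizers exposes neither CharBPETokenizer nor BPETokenizer"
    )

# Usage (assumes the tokenizers package is installed):
# import tokenizers
# BPETokenizer = get_bpe_tokenizer_class(tokenizers)
```

This avoids pinning to tokenizers 0.2.1 just to keep the old name working.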
Original:
I just tried using the pypi package and it didn't work.
I opened a new virtual environment, installed tokenizers versions 0.4.2, 0.4.1, 0.4.0 and 0.3.0, started a Python console (using PyCharm, to be exact) and typed:
```python
from tokenizers import BPETokenizer
```
For all these versions it would tell me:
Tokenizers 0.2.1 works fine.
My machine: