
Fixing a pathological case for slow tokenizers #14981

Merged — 2 commits merged into huggingface:master from Narsil:fix_trie_pathological_case on Dec 30, 2021

Conversation

@Narsil (Contributor) commented Dec 29, 2021

What does this PR do?

Fixes, for slow tokenizers, the issue reported in huggingface/tokenizers#848.

When using arbitrary tokens, a bug could occur where the new character was still fed into the trie states even when the lookahead attempted to skip over that part of the text. As a result, we could end up with an extra match that didn't fit.

The other bug was that during the lookahead we could undermatch, since we consumed new characters before checking for termination; the ordering problem is sketched below.
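
A minimal sketch of that ordering problem, assuming the usual nested-dict trie where an empty-string key marks the end of a token. The `lookahead_matches` helper below is hypothetical, written only to illustrate the bug, and is not the actual transformers code:

```python
# Hypothetical helper, not the transformers implementation. In this trie
# layout, "" as a dict key marks a complete token.

def lookahead_matches(trie_state: dict, text: str, start: int) -> list:
    """Return the end offsets of all tokens found starting at ``start``."""
    ends = []
    pos = start
    while True:
        # A loop whose condition requires a *next* character to consume exits
        # when the text runs out (or the next char is missing from the trie)
        # without ever checking the final state, silently dropping a token
        # that ends exactly here -- the undermatch described above.
        if "" in trie_state:  # check termination BEFORE consuming
            ends.append(pos)
        if pos >= len(text) or text[pos] not in trie_state:
            break
        trie_state = trie_state[text[pos]]
        pos += 1
    return ends

# With only "AB" in the trie, the match ending at offset 2 is recorded;
# a consume-first loop would break at the end of the text before seeing it.
trie_data = {"A": {"B": {"": 1}}}
print(lookahead_matches(trie_data, "AB", 0))  # [2]
```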

The test added covers both cases.
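
For reference, the kind of overlap the new test exercises can be sketched with the public Trie API (`Trie.add` and `Trie.split` live in src/transformers/tokenization_utils.py); the specific tokens below are illustrative, not necessarily the ones used in the test:

```python
from transformers.tokenization_utils import Trie

trie = Trie()
trie.add("ABC")  # a long token
trie.add("B")    # a token hidden inside the long one

# Once "ABC" matches, the lookahead must skip past its interior characters;
# the first bug let the interior "B" sneak in as an extra, non-fitting match.
print(trie.split("ABC"))  # expected: ["ABC"]
```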


Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@sgugger (Collaborator) left a comment:


Thanks for fixing!

src/transformers/tokenization_utils.py — outdated review thread, resolved.
@Narsil Narsil merged commit d7d60df into huggingface:master Dec 30, 2021
@Narsil Narsil deleted the fix_trie_pathological_case branch December 30, 2021 08:10
stevhliu pushed a commit to stevhliu/transformers that referenced this pull request Jan 6, 2022
* Fixing a pathological case for slow tokenizers

* Update src/transformers/tokenization_utils.py