Commit

Style and quality fix
pokjay committed Sep 5, 2023
1 parent 7470549 commit e27c496
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions src/transformers/generation/logits_process.py
@@ -1297,9 +1297,9 @@ def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor:

 class ExponentialDecayLengthPenalty(LogitsProcessor):
     r"""
-    [`LogitsProcessor`] that exponentially increases the score of the `eos_token_id` after `start_index` has been reached.
-    This allows generating shorter sequences without having a hard cutoff, allowing the `eos_token` to be predicted in a
-    meaningful position.
+    [`LogitsProcessor`] that exponentially increases the score of the `eos_token_id` after `start_index` has been
+    reached. This allows generating shorter sequences without having a hard cutoff, allowing the `eos_token` to be
+    predicted in a meaningful position.
     Args:
         exponential_decay_length_penalty (`tuple(int, float)`):

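For context, a minimal usage sketch of the processor whose docstring this commit reflows. It is a sketch only: it assumes a transformers version in which `generate` accepts an `exponential_decay_length_penalty=(start_index, decay_factor)` argument that drives this processor, and the `gpt2` checkpoint and the `(20, 1.5)` values are illustrative choices, not taken from the commit.

```python
# Minimal sketch (not part of the commit): exercise ExponentialDecayLengthPenalty
# via generate(). The "gpt2" checkpoint and the (20, 1.5) values are illustrative
# assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The quick brown fox", return_tensors="pt")

# After 20 generated tokens, the EOS score is boosted by an exponentially
# growing factor (decay factor 1.5), so generation tends to end soon after
# that point without imposing a hard length cutoff.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=False,
    exponential_decay_length_penalty=(20, 1.5),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Compared with a hard `max_length`, this lets the model place the `eos_token` at a natural point in the text while still discouraging long outputs, which is exactly the behaviour the reflowed docstring describes.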