
Commit

fix prompt loop
jquesnelle committed Jul 10, 2023
1 parent feedbbe commit a716d78
Showing 1 changed file with 1 addition and 1 deletion.
prompt-loop.py: 1 addition, 1 deletion
@@ -13,7 +13,7 @@ def main(args):
 
     model = load_model(args.model, args.load_in_8bit,
                        args.load_in_4bit, args.max_tokens)
-    apply_patches(model, args.max_tokens, args.dynamic_ntk,
+    apply_patches(model, args.max_new_tokens, args.dynamic_ntk,
                   args.dynamic_linear, args.ntk, args.linear, args.part_ntk)
 
     pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, pad_token_id=tokenizer.eos_token_id,
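The one-line fix changes which command-line limit is handed to apply_patches: the generation budget (args.max_new_tokens) instead of the args.max_tokens value that is also passed to load_model. As general background only, the sketch below illustrates the difference between a new-token budget and a total-length budget in a Hugging Face text-generation pipeline; the model name and values are hypothetical and are not taken from prompt-loop.py.

    # Illustrative sketch, not code from this repository.
    # max_new_tokens caps only the generated continuation, whereas a
    # total-length limit (max_length) would cap prompt + continuation together.
    from transformers import pipeline

    pipe = pipeline("text-generation", model="gpt2")  # any causal LM works for the demo

    prompt = "The quick brown fox"
    out = pipe(prompt, max_new_tokens=32)  # generate at most 32 new tokens
    print(out[0]["generated_text"])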
