When running inference with Yi-1.5-9B-Chat, the spaces between tokens are missing from the decoded output. Is this a bug?

transformers==4.47

The relevant excerpt of my sampling loop:
```python
import torch
import torch.nn.functional as F

# One iteration of the sampling loop (runs once per generated token).
outputs = self.model(input_ids=input_ids, attention_mask=attention_mask, use_cache=True)
next_token_logits = outputs.logits[:, -1, :] / temp  # temperature scaling
next_token_logits = next_token_logits.to(self.model.device)
next_token_logits = self.filter_forbidden_tokens(next_token_logits)
next_token_logits = logits_processor(input_ids, next_token_logits)
output_tokens += 1
step += 1
next_token_id = torch.multinomial(F.softmax(next_token_logits, dim=-1), num_samples=1)
# Each sampled token is decoded in isolation; this is where the spaces disappear.
next_word = self.tokenizer.decode(next_token_id[0], skip_special_tokens=True)
response += next_word
```
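A side note on the loop itself: decoding one token id at a time tends to drop the leading-space marker ("▁") that SentencePiece-style tokenizers attach to the token, so spaces vanish even when the model is fine. A workaround that avoids editing the tokenizer config is to decode the accumulated sequence each step and emit only the new suffix, similar in spirit to transformers' `TextStreamer`. A minimal sketch; `IncrementalDecoder` is a hypothetical helper, not part of the original code:

```python
class IncrementalDecoder:
    """Decode the running token sequence and return only the newly added text.

    Decoding the full sequence keeps the tokenizer's own spacing rules intact,
    which per-token decode() calls do not.
    """

    def __init__(self, tokenizer):
        self.tokenizer = tokenizer
        self.ids = []        # all token ids sampled so far
        self.text = ""       # text already returned to the caller

    def step(self, token_id: int) -> str:
        self.ids.append(token_id)
        full = self.tokenizer.decode(self.ids, skip_special_tokens=True)
        delta = full[len(self.text):]  # suffix contributed by this token
        self.text = full
        return delta
```

In the loop above, `next_word = self.tokenizer.decode(next_token_id[0], ...)` would become `next_word = decoder.step(next_token_id[0].item())`.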
Thanks, I solved this problem by changing add_prefix_space to false in tokenizer_config.json.
Reference: #36
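In case it helps others, the one-key edit described above can also be scripted. This is just a sketch of the fix reported in this thread, with a hypothetical local path to the downloaded model:

```python
import json

# Hypothetical path to the locally downloaded model; adjust as needed.
cfg_path = "Yi-1.5-9B-Chat/tokenizer_config.json"

with open(cfg_path) as f:
    cfg = json.load(f)

# The fix reported above: set add_prefix_space to false.
cfg["add_prefix_space"] = False

with open(cfg_path, "w") as f:
    json.dump(cfg, f, indent=2, ensure_ascii=False)
```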