
[BFCL] Fix Llama Handler #626

Merged 3 commits into ShishirPatil:main on Sep 9, 2024
Conversation

HuanzhiMao (Collaborator) commented:

According to the Llama chat template on Hugging Face, the `_format_prompt` method in `LlamaHandler` is missing two `\n` after each `<|end_header_id|>` tag. This PR fixes it.

This will affect the leaderboard scores for the models meta-llama/Meta-Llama-3-8B-Instruct and meta-llama/Meta-Llama-3-70B-Instruct.
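
For illustration, here is a minimal sketch of the corrected prompt formatting. The method name `_format_prompt` comes from this PR, but the signature, the omission of function definitions, and the OpenAI-style message format are assumptions and may differ from BFCL's actual handler:

```python
# Sketch only: assumes messages are OpenAI-style dicts with "role" and "content".
# BFCL's real _format_prompt also handles function definitions, which are omitted here.
def _format_prompt(messages):
    formatted_prompt = "<|begin_of_text|>"
    for message in messages:
        # Llama 3 chat template: each <|end_header_id|> must be followed by
        # two newlines before the message content (the "\n\n" this PR adds).
        formatted_prompt += (
            f"<|start_header_id|>{message['role']}<|end_header_id|>\n\n"
            f"{message['content'].strip()}<|eot_id|>"
        )
    # Open the assistant header so the model generates the next turn.
    formatted_prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return formatted_prompt


if __name__ == "__main__":
    example = [
        {"role": "system", "content": "You are a function-calling assistant."},
        {"role": "user", "content": "What's the weather in Berlin?"},
    ]
    print(_format_prompt(example))
```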

@HuanzhiMao HuanzhiMao added the BFCL-General General BFCL Issue label Sep 7, 2024
@HuanzhiMao HuanzhiMao marked this pull request as ready for review September 7, 2024 23:13
Fanjia-Yan (Collaborator) left a comment:


LGTM

@HuanzhiMao HuanzhiMao mentioned this pull request Sep 8, 2024
@ShishirPatil ShishirPatil merged commit cddb4af into ShishirPatil:main Sep 9, 2024
@HuanzhiMao HuanzhiMao deleted the fix-llama branch September 9, 2024 20:19
ShishirPatil pushed a commit that referenced this pull request Sep 15, 2024
…627, #635, and #638. (#639)

This PR updates the leaderboard to reflect the change in score due to
the following PR merge:

1. #608
2. #600
3. #616 
4. #623
5. #626
6. #627
7. #635 
8. #638

---------

Co-authored-by: Charlie Cheng-Jie Ji <[email protected]>