
Support HuggingFace's inference API #352

Merged
viswavi merged 2 commits into main from vijay_support_huggingface_api on Sep 12, 2023
Conversation

@viswavi (Collaborator) commented on Sep 12, 2023

Description

To support HuggingFace's inference API, we need to expose the "api_base" parameter for LiteLLM. This PR exposes that parameter to the APIAgent.

References

https://docs.litellm.ai/docs/providers/huggingface
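
For context, the linked LiteLLM docs show that HuggingFace inference endpoints are reached by passing api_base to litellm.completion. A minimal sketch of what exposing that parameter enables (the model name and endpoint URL below are illustrative placeholders, not values from this PR):

import litellm

# Route a completion through a HuggingFace inference endpoint via LiteLLM.
# Both the model name and the endpoint URL are placeholder examples.
response = litellm.completion(
    model="huggingface/bigcode/starcoder",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="https://my-endpoint.endpoints.huggingface.cloud",
)
print(response.choices[0].message.content)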

Blocked by

N/A

@viswavi requested review from neubig and saum7800 on September 12, 2023 at 15:06
@saum7800 (Collaborator) left a comment:


Left one comment to understand one change; looks good to me otherwise.

@@ -93,7 +93,7 @@ def parse_from_prompt(self, prompt: str) -> None:
         response: openai.ChatCompletion | Exception = (
             chat_api.generate_one_completion(
                 parsing_prompt_for_chatgpt,
-                temperature=0,
+                temperature=0.01,
@saum7800 (Collaborator) commented on the diff:

Is this solving a problem related to HuggingFace generation needing a temperature > 0? Or was it added so that retries have a chance of succeeding if the first attempt fails?

@viswavi (Collaborator, Author):

Yes, this solves the problem that HuggingFace generation requires a positive temperature.

Contributor:

Maybe we can handle this as part of the LiteLLM defaults? Thoughts, @saum7800 @viswavi?

@viswavi (Collaborator, Author):

> Maybe we can handle this as part of the LiteLLM defaults? Thoughts, @saum7800 @viswavi?

Can you elaborate on what you mean? We use different temperatures in different places in Prompt2Model, so we would prefer not to rely on LiteLLM's default values (in case we want to set the temperature to something specific).

But I think that preventing a temperature of 0 in LiteLLM (or bumping 0 up to 0.0001) is a good idea, since a temperature of 0 is valid for OpenAI's models and users can reasonably pass it.
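
As a sketch of the "bump zero" idea (a hypothetical helper, not code from this PR; the floor value is the 0.0001 floated above):

def clamp_temperature(temperature: float) -> float:
    # Bump a zero temperature to a small positive value so providers like
    # HuggingFace, which reject temperature=0, still accept the request.
    # OpenAI callers passing 0 would get an effectively equivalent 0.0001.
    return max(temperature, 0.0001)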

Contributor:

Gotcha. Since we only pass the temperature when the user sets it, I guess this is a non-issue on our end.

Thanks for the feedback!
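
To make the "only pass temperature when the user sets it" point concrete, here is an assumed pattern (not the repo's exact code) for forwarding the parameter conditionally so the provider's own default applies otherwise:

import litellm

def generate(model: str, messages: list[dict], temperature: float | None = None):
    # Only forward temperature when the caller explicitly set it; otherwise
    # omit it entirely and let LiteLLM / the provider use its default.
    kwargs = {}
    if temperature is not None:
        kwargs["temperature"] = temperature
    return litellm.completion(model=model, messages=messages, **kwargs)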

@viswavi merged commit 779c7b9 into main on Sep 12, 2023
@viswavi deleted the vijay_support_huggingface_api branch on September 12, 2023 at 17:47