
Create Google LM #398

Merged
merged 4 commits into from
Feb 17, 2024

Conversation

CShorten
Collaborator

Followed the Cohere LM template. It doesn't look like you can sample multiple generations in one call to the API (linked the Google API docs I'm following in the code). Outside of that, I think it is a pretty standard implementation in sync with the other LMs.
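Since the API apparently returns a single candidate per call, n completions have to be gathered by issuing n requests. A minimal sketch, assuming a `request_fn` callable standing in for the real API call (both names are illustrative, not the PR's code):

```python
# Hypothetical sketch: collect n completions by calling the
# single-candidate API once per sample.
def sample_n(request_fn, prompt, n=3):
    """Call request_fn once per desired completion and collect results."""
    return [request_fn(prompt) for _ in range(n)]

# Stub standing in for the real API call.
completions = sample_n(lambda p: f"echo: {p}", "hello", n=2)
```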

Comment:

  • Cool to get a better understanding of how history is stored as internal state to the LMs as a list[dict[str, Any]]. I would like to explore adding functionality around inspect_history next.
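The history mechanism described above can be sketched roughly as follows; the `log_call` / `inspect_history` names here are illustrative placeholders, not the library's actual API:

```python
from typing import Any

# Sketch: each LM call is appended to a module-level history list of
# dicts, and inspect_history returns the most recent entries.
history: list[dict[str, Any]] = []

def log_call(prompt: str, response: str, **kwargs: Any) -> None:
    """Record one LM call as a dict of prompt, response, and settings."""
    history.append({"prompt": prompt, "response": response, "kwargs": kwargs})

def inspect_history(n: int = 1) -> list[dict[str, Any]]:
    """Return the last n recorded calls."""
    return history[-n:]

log_call("Hi", "Hello!", temperature=0.0)
last = inspect_history()
```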

Additional arguments to pass to the API provider.
"""
super().__init__(model)
self.google = genai.configure(api_key="")
Contributor

Could it be this? Passing `api_key` through instead of the hardcoded `""`:

self.google = genai.configure(api_key=api_key)

Collaborator Author

Ah great catch! Sorry about that! Updating!
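A common variant of the fix is to fall back to an environment variable when no key is passed explicitly; a hedged sketch (the `GOOGLE_API_KEY` variable name is an assumption, not something the PR specifies):

```python
import os

# Sketch: prefer an explicitly passed key, otherwise fall back to an
# environment variable; empty string if neither is set.
def resolve_api_key(api_key=None):
    return api_key or os.environ.get("GOOGLE_API_KEY", "")

key = resolve_api_key("abc123")
```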

self.provider = "google"
self.kwargs = {
"model_name": model,
"temperature": 0.0,
Contributor

It might be convenient to let temperature and other configuration values be overridden through kwargs.

How about something like this?

"temperature": 0.0 if "temperature" not in kwargs else kwargs["temperature"],

Collaborator Author

Awesome, good call!
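The per-key conditional suggested above can also be written once for all settings with dict merging; a sketch (the default values here are illustrative, not the PR's exact code):

```python
# Sketch: user-supplied kwargs override the defaults key by key,
# without repeating "x if 'x' not in kwargs else kwargs['x']".
default_kwargs = {
    "model_name": "gemini-pro-1.0",
    "temperature": 0.0,
}

def build_kwargs(**kwargs):
    """Merge user kwargs over the defaults; later keys win."""
    return {**default_kwargs, **kwargs}

merged = build_kwargs(temperature=0.7)
```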

self,
model: str = "gemini-pro-1.0",
api_key: Optional[str] = None,
stop_sequences: list[str] = [],
Contributor

Would `stop_sequences` actually be used anywhere?

Collaborator Author

Ah another great catch!
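One way `stop_sequences` could be wired in is client-side truncation at the first occurrence of any stop string; a minimal sketch (the real fix may instead pass the list into the API's generation config):

```python
# Sketch: cut the completion at the earliest occurrence of any
# stop sequence; return it unchanged if none appear.
def apply_stops(text: str, stop_sequences: list[str]) -> str:
    cut = len(text)
    for stop in stop_sequences:
        i = text.find(stop)
        if i != -1:
            cut = min(cut, i)
    return text[:cut]

out = apply_stops("Answer: 42\n---\nextra", ["\n---"])
```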

@insop
Contributor

insop commented Feb 16, 2024

Thank you so much @CShorten!

It'd be great if you could update this markdown doc as well.
docs/language_models_client.md

@okhat okhat merged commit c8405aa into stanfordnlp:main Feb 17, 2024
1 check passed
@okhat
Collaborator

okhat commented Feb 17, 2024

This is fantastic, thank you so much @CShorten and @insop ! I'm excited about Connor's vision for when 1.5 is released!
