Create Google LM #398
Conversation
dsp/modules/google.py (Outdated)
        Additional arguments to pass to the API provider.
        """
        super().__init__(model)
        self.google = genai.configure(api_key="")
Could it be this? Setting api_key instead of "":

self.google = genai.configure(api_key=api_key)
Ah great catch! Sorry about that! Updating!
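To illustrate the fix being discussed, here is a minimal sketch of an `__init__` that forwards the caller's key instead of hard-coding `api_key=""`. The `configure` function and the environment-variable fallback are stand-ins and assumptions for illustration, not the actual `genai` API.

```python
import os

def configure(api_key=None):
    """Stand-in for genai.configure; just records the key it was given."""
    return {"api_key": api_key}

class Google:
    def __init__(self, model="gemini-pro", api_key=None):
        # Assumed fallback: read the key from the environment when none is passed.
        key = api_key or os.environ.get("GOOGLE_API_KEY")
        # The fix: forward the caller's key, not api_key=""
        self.google = configure(api_key=key)
        self.provider = "google"
```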
dsp/modules/google.py (Outdated)
        self.provider = "google"
        self.kwargs = {
            "model_name": model,
            "temperature": 0.0,
temperature and other configuration might be convenient to update through kwargs. How about something like this?

"temperature": 0.0 if "temperature" not in kwargs else kwargs["temperature"],
Awesome, good call!
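A sketch of the override pattern from this thread: `kwargs.get(key, default)` is equivalent to the conditional expression suggested above but reads more compactly. The `max_output_tokens` key is an assumed extra parameter for illustration, not necessarily part of the actual config.

```python
def build_config(model, **kwargs):
    # Defaults that the caller can override through kwargs.
    return {
        "model_name": model,
        "temperature": kwargs.get("temperature", 0.0),
        # Hypothetical second parameter showing the same pattern.
        "max_output_tokens": kwargs.get("max_output_tokens", 2048),
    }
```

Called with no overrides, `build_config("gemini-pro")` keeps `temperature` at 0.0; passing `temperature=0.7` replaces the default.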
dsp/modules/google.py (Outdated)
        self,
        model: str = "gemini-pro-1.0",
        api_key: Optional[str] = None,
        stop_sequences: list[str] = [],
Would stop_sequences be used?
Ah another great catch!
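Beyond being unused, `stop_sequences: list[str] = []` is the classic mutable-default pitfall: the default list is created once at function definition and shared across every call that mutates it. A small sketch of the pitfall and the usual `None`-default fix (the `"###"` sentinel value is just an example):

```python
def risky(stop_sequences=[]):
    # The default list is created once and shared across all calls.
    stop_sequences.append("###")
    return stop_sequences

def safe(stop_sequences=None):
    # Create a fresh list per call instead.
    stop_sequences = [] if stop_sequences is None else list(stop_sequences)
    stop_sequences.append("###")
    return stop_sequences
```

Calling `risky()` twice returns `["###"]` then `["###", "###"]`, while `safe()` returns `["###"]` every time.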
Thank you so much @CShorten! It'd be great if you could update this markdown doc as well.
Followed the Cohere LM template. It doesn't look like the API supports sampling multiple generations in one call (I linked the Google API docs I'm following in the code). Outside of that, it's a pretty standard implementation, in sync with the other LMs.
Comment: history is stored as internal state to the LMs as a list[dict[str, Any]]. I would like to explore adding functionality around inspect_history next.
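A minimal sketch of that internal-state pattern, assuming one dict per request/response pair and a hypothetical `inspect_history` that returns the most recent entries (the `request` body is a stand-in, not the real API call):

```python
from typing import Any

class LM:
    def __init__(self, model: str):
        self.kwargs = {"model_name": model}
        # Internal state: one dict appended per request/response pair.
        self.history: list[dict[str, Any]] = []

    def request(self, prompt: str) -> str:
        response = f"echo: {prompt}"  # stand-in for the real API call
        self.history.append({"prompt": prompt, "response": response})
        return response

    def inspect_history(self, n: int = 1) -> list[dict[str, Any]]:
        # Return the n most recent entries, oldest first.
        return self.history[-n:]
```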