Colab demo #289
Comments
I would love that! |
Yes that would be great! |
This has been created!
- Colab: https://colab.research.google.com/github/neulab/prompt2model/blob/main/prompt2model_demo.ipynb
- GitHub: https://github.com/neulab/prompt2model/blob/main/prompt2model_demo.ipynb
@360macky and @KabaTubare, if you have a chance to try it out and have any feedback we'd be happy to hear it! |
Great! Is there a provision to store the model after fine-tuning for future deployment in a production environment?
|
Yeah! If you run the notebook locally it will automatically be saved in the trained_model and trained_tokenizer directories. If you're on Colab you can just click on the Folder icon on the left and download the models. Inference can be performed in the same way as you would do inference with any Hugging Face model (e.g. using Hugging Face directly, Text Generation Inference, or vLLM). |
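For reference, a minimal sketch of what that inference step could look like, assuming the default trained_model/ and trained_tokenizer/ output directories mentioned above and a seq2seq (T5-style) model; swap the Auto class if the trained model uses a different architecture:

```python
# Minimal sketch: load the model/tokenizer saved by the notebook and run inference.
# Assumes the default ./trained_model and ./trained_tokenizer output directories
# and a seq2seq architecture; adjust AutoModelForSeq2SeqLM if yours differs.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./trained_tokenizer")
model = AutoModelForSeq2SeqLM.from_pretrained("./trained_model")

inputs = tokenizer("your task input here", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```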
Thanks Graham!
|
Hi Graham,
Quick question: if none of the datasets found on Hugging Face suit the project, would the next section, which generates datasets with OpenAI, generate them automatically? It seems so, but I want to check before incurring the large API cost. Thanks a million.
Br,
TW
|
It'd be nice to have a demo of using prompt2model on colab so people can see it and play around with it more easily.