Add fittable #140
base: main
Conversation
Codecov Report
Attention: Patch coverage is …
Looks good! Some minor comments and suggestions.
Looks super useful. Left some comments. Also, perhaps we can add a reference to multi-label usage somewhere?
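For illustration, a hedged sketch of what multi-label usage might look like; the import path, class name, list-of-lists label format, and the predict call are assumptions rather than anything confirmed in this PR:

# Hypothetical multi-label usage; import path, class name, label format,
# and predict() are assumptions, not taken from this PR.
from model2vec.train import StaticModelForClassification

texts = ["great battery life", "screen cracked on day one", "noisy fan, fast shipping"]
labels = [["battery", "positive"], ["screen", "negative"], ["noise", "shipping"]]  # several labels per text

classifier = StaticModelForClassification.from_pretrained("minishlab/potion-base-8M")  # example model id
classifier.fit(texts, labels)
print(classifier.predict(["the fan is loud but the battery lasts"]))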
"""Save the model to a folder.""" | ||
save_pipeline(self, path) | ||
|
||
def push_to_hub(self, repo_id: str, token: str | None = None, private: bool = False) -> None: |
I would add a model card and perhaps tags or a library reference; this helps a lot with visibility, usability, and findability.
https://huggingface.co/docs/hub/model-cards#specifying-a-library
This actually already happens because we push the underlying static model to the Hub, which has a model card. The model card template is specified in the root of the codebase.
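For illustration, a minimal call against the push_to_hub signature shown in the diff above; the repo id is a placeholder and classifier stands in for an already-fitted instance:

# Minimal sketch using the push_to_hub signature from the diff above.
# "my-username/my-fitted-classifier" is a placeholder repo id, and `classifier`
# stands in for an already-fitted instance of the class added in this PR.
classifier.push_to_hub("my-username/my-fitted-classifier", private=True)
# Per the discussion above, the underlying static model (with its model card)
# is pushed along with the pipeline.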
        self.head = head

    @classmethod
    def from_pretrained(
Can't we load it from the Hub? Perhaps we should align the arguments a bit with the transformers naming, given you've also adopted from_pretrained? For example, using pretrained_model_name_or_path: https://huggingface.co/docs/transformers/v4.48.0/en/model_doc/auto#transformers.AutoTokenizer.from_pretrained
from_pretrained loads from the Hub. The arguments mimic the ones from StaticModel and, although they don't match transformers exactly, we're wary of introducing breaking changes.
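For illustration, a sketch of loading a fitted model back from the Hub, assuming the first argument mirrors StaticModel.from_pretrained (a local path or Hub repo id); the import path and repo id are placeholders:

# Hedged sketch: assumes the class name/import path and that the first
# argument mirrors StaticModel.from_pretrained (local path or Hub repo id).
from model2vec.train import StaticModelForClassification

classifier = StaticModelForClassification.from_pretrained("my-username/my-fitted-classifier")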