
Release SCOPE on Hugging Face #1

Open
NielsRogge opened this issue Dec 24, 2024 · 2 comments

Comments

@NielsRogge
Hello @Linking-ai 🤗

I'm Niels and work as part of the open-source team at Hugging Face. I discovered your work through Hugging Face's daily papers, as yours got featured: https://huggingface.co/papers/2412.13649. The paper page lets people discuss your paper and find its artifacts (your models, for instance); you can also claim the paper as yours, which will show up on your public profile at HF. I noticed the GitHub repo is present, but the code is not available yet. Looking forward to the code and model release.

Would you like to host the model you've pre-trained on https://huggingface.co/models? Hosting on Hugging Face will give you more visibility and better discoverability. We can add tags to the model cards so that people find the models more easily, link them to the paper page, etc.

If you're down, leaving a guide here. If it's a custom PyTorch model, you can use the PyTorchModelHubMixin class, which adds from_pretrained and push_to_hub to the model, letting you upload it and letting people download and use it right away. If you'd rather upload the model directly through the UI (or however you prefer), people can still fetch it with hf_hub_download. Once uploaded, we can also link the models to the paper page (read here) so people can discover them.

You can also build a demo for your model on Spaces; we can provide you an A100 grant.

Let me know if you're interested or need any guidance.

Kind regards,
Niels

@Linking-ai
Owner

Hello Niels 🤗 Thank you for your attention to our work! Our approach primarily focuses on inference-time methods: the backbone models we use are open-source pre-trained models on Hugging Face (e.g., Llama 3.1: Llama-3.1-8B-Instruct). We have made modifications to the attention-related components of the Transformers library, but we do not perform any additional training of models or weights.
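The inference-only setup described here — swapping in a modified attention forward pass without retraining — follows the usual monkey-patching pattern. The classes below are toy stand-ins, not the actual SCOPE code, which would target attention modules inside the Transformers library:

```python
# Toy illustration of patching an attention class's forward method
# at inference time, with no changes to model weights.
class ToyAttention:
    """Stand-in for a Transformers attention module."""

    def forward(self, kv_cache):
        # Original behavior: attend over the full KV cache
        # (here crudely modeled as averaging the cached values).
        return sum(kv_cache) / len(kv_cache)


_original_forward = ToyAttention.forward


def patched_forward(self, kv_cache):
    # Hypothetical modification: keep only the most recent entries
    # (a crude sketch of KV-cache compression) before attending.
    window = kv_cache[-2:]
    return _original_forward(self, window)


ToyAttention.forward = patched_forward  # monkey-patch; no retraining

attn = ToyAttention()
print(attn.forward([1.0, 2.0, 3.0, 4.0]))  # uses only [3.0, 4.0] -> 3.5
```

Because only the class attribute is replaced, every attention instance picks up the new behavior, and the original weights are untouched.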

We will definitely take your suggestion into consideration and update the corresponding Hugging Face paper page (https://huggingface.co/papers/2412.13649), as well as provide more details on Hugging Face to make it easier for users to discover our work, and link our GitHub repository to the paper page.

Once again, thank you for your support and guidance. We truly appreciate the Hugging Face community!

@NielsRogge
Author

Thanks for your clarification @Linking-ai !

If you have any upcoming works, feel free to let us know :)
