Listing PyTorch Models in AWS Marketplace using Torch Serve #1573
Conversation
Authored-by: Rick Cao <[email protected]>
Please do not upgrade people's kernel to SageMaker v2, as this will make all other v1 notebooks fail.
If you require v2, please upgrade to it, then roll it back in the notebook's cleanup section.
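For what it's worth, one way that upgrade-then-rollback pattern could look in the notebook (only a sketch, not the code currently in the PR; the v2 pin and file name are illustrative):

```python
# Setup cell: remember the SDK version the kernel started with, then move to v2.
import sagemaker

# Persist the original version so the cleanup cell still knows it after a kernel restart.
with open("original_sagemaker_version.txt", "w") as f:
    f.write(sagemaker.__version__)

!pip install -q sagemaker==2.24.1  # illustrative v2 pin, not a validated version

# ... notebook body runs against the v2 SDK ...

# Cleanup cell: roll the kernel back to whatever version it had before this notebook ran.
original_version = open("original_sagemaker_version.txt").read().strip()
!pip install -q sagemaker=={original_version}
```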
I see there's a lot of setup in here involving building a torchserve container. Why not just use one that's already built for this?
We use raw TorchServe because our sellers may need to write their own handlers for special models. Our notebook serves as an end-to-end template for them to publish the product.
The sagemaker-pytorch-inference-toolkit is cool, but it also requires sellers to dig into additional information. Our notebook setup places the least knowledge burden on sellers. We also have a blog post with the instructions.
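For context on what "their own handlers" means in raw TorchServe, a custom handler is a small Python class; here is a minimal sketch (the class name, file name, and payload format are illustrative, not taken from this notebook):

```python
# model_handler.py -- minimal custom TorchServe handler (illustrative sketch).
import json

import torch
from ts.torch_handler.base_handler import BaseHandler


class SpecialModelHandler(BaseHandler):
    """The kind of handler a seller might write for a model with custom I/O."""

    def preprocess(self, data):
        # Assume each request body is a JSON list of numbers.
        body = data[0].get("body") or data[0].get("data")
        if isinstance(body, (bytes, bytearray)):
            body = json.loads(body)
        return torch.as_tensor(body, dtype=torch.float32, device=self.device)

    def inference(self, inputs):
        # self.model is loaded by BaseHandler.initialize() from the model archive.
        with torch.no_grad():
            return self.model(inputs)

    def postprocess(self, outputs):
        # TorchServe expects a list with one entry per request in the batch.
        return [outputs.detach().cpu().tolist()]
```

A handler like this would then be packaged with torch-model-archiver and baked into the serving image, which is roughly the container-build flow this notebook walks sellers through.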
Ok, we also have pre-built Deep Learning Containers, and PyTorch 1.6 and above has built-in TorchServe support. You'd be able to inject custom models and handlers with those. Something to look into, at least?
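For reference, the DLC route with a custom handler script might look roughly like this with the v2 SDK (the bucket, file names, and versions below are illustrative, not from this PR):

```python
# Illustrative sketch: serve a custom model + handler on the pre-built PyTorch
# Deep Learning Container via the SageMaker Python SDK (v2-style API).
import sagemaker
from sagemaker.pytorch import PyTorchModel

role = sagemaker.get_execution_role()

model = PyTorchModel(
    model_data="s3://my-bucket/model/model.tar.gz",  # weights packaged as a tarball
    role=role,
    entry_point="inference.py",   # custom model_fn / input_fn / predict_fn / output_fn
    source_dir="code",            # local dir holding inference.py and its dependencies
    framework_version="1.6.0",    # 1.6+ inference images use TorchServe under the hood
    py_version="py3",
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```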
I have some code in this notebook that sets up v2 and rolls it back at the end.
Checking this again... there are still SageMaker version issues; there is an upgrade call in there that will break the customer's kernel for all other notebooks.
"outputs": [], | ||
"source": [ | ||
"!pip install --upgrade pip\n", | ||
"!pip -q install sagemaker awscli boto3 --upgrade " |
Please remove this and replace it with a pin to a specific version that works...
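That is, something along these lines, where the exact pin is whatever the notebook was actually validated against (the version below is a placeholder, not a tested pin):

```python
# Pin the SDK instead of blindly upgrading, so other notebooks on the same kernel keep working.
!pip install -q sagemaker==2.24.1 awscli boto3  # placeholder version; use the one this notebook was tested against
```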
Authored-by: Rick Cao [email protected]
Issue #, if available:
Description of changes:
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.