Getting 404 when talking to InferenceService #2007

Closed
Hmr-ramzi opened this issue Sep 7, 2021 · 3 comments

@Hmr-ramzi

Version: 1.4.0-rc.0

While trying out this example: https://github.com/kubeflow/kfserving/tree/master/docs/samples/v1beta1/torchserve

The InferenceService was deployed successfully: Ready = True for both the InferenceService and the underlying Knative service.
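For reference, the readiness can be confirmed from the status columns with something like the following sketch (it assumes the InferenceService is named torchlocal in namespace namespace, as the Host header further down suggests):

# READY should report True for both objects
kubectl get inferenceservice torchlocal -n namespace
kubectl get ksvc -n namespace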

I am using the knative-serving/knative-ingress-gateway Gateway, which is wired to the istio: ingressgateway on port 80:

[screenshot: knative-ingress-gateway configuration]
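For completeness, that wiring can be double-checked with something like this sketch (the resource names assume the default Knative-on-Istio install; adjust the name/namespace if yours differ):

# The spec should show selector "istio: ingressgateway" and an HTTP server on
# port 80, matching the screenshot above
kubectl get gateway knative-ingress-gateway -n knative-serving -o yaml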

Once I try:

curl -v -H "Host: torchlocal.namespace.svc.cluster.local" http://istio-ingressgateway:80/v1/models/mnist:predict -d @./mnist.json

I get a 404 reply, and I also see this error in the istio-ingressgateway logs:

[screenshot: 404 error in the istio-ingressgateway logs]
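For reference, the hostname the gateway actually routes can be compared against the Host header above with something like this sketch (it assumes the InferenceService is named torchlocal in namespace namespace, as the Host header suggests):

# The URL registered for this InferenceService in its status
kubectl get inferenceservice torchlocal -n namespace -o jsonpath='{.status.url}'

# Retry the request with the host portion of that URL; the ingress gateway
# answers with 404 when the Host header does not match any route it knows about
curl -v -H "Host: <host from the URL above>" \
  http://istio-ingressgateway:80/v1/models/mnist:predict -d @./mnist.json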

@kimwnasptd (Member)

A little bit late, but @Hmr-ramzi, if you still have a cluster with this issue, could you check whether it's the same problem as #2082?

I also managed to bump into a case where all InferenceServices return 404 errors.
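One rough way to compare the two cases (a sketch, reusing the names from the report above) is to look at the VirtualServices generated for the InferenceService and the hosts they match:

# List the VirtualServices in the InferenceService's namespace and inspect the
# hosts each one matches; a Host header outside this list is answered with 404
kubectl get virtualservice -n namespace
kubectl get virtualservice -n namespace -o yaml | grep -A3 'hosts:'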


stale bot commented Apr 16, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in one week if no further activity occurs. Thank you for your contributions.


stale bot commented Apr 29, 2022

This issue has been closed due to inactivity.

stale bot closed this as completed Apr 29, 2022