OpenVINO: Encountered unknown exception in Run() #20069
Comments
Having the same error using this model: https://huggingface.co/SmilingWolf/wd-swinv2-tagger-v3
Using command:
- On CPU with OpenVINO
- On an Intel Arc A770
- On an AMD RX 7900 XTX
Same script runs fine on
Yes, there is a regression. The binary DLL uploaded in github.com/intel/onnxruntime for 1.17.1 onnxruntime-openvino is compatible only with OpenVINO 2023.3.0.
I tried building from the main branch (commit id: a2998e5) and it runs fine now. Had to use OpenVINO 2023.3, since building with 2024.0 segfaulted on import. Ran into the same issue as intel/neural-speed#188 and had to add a workaround for it. Build command:
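The actual build command was not preserved here; a minimal sketch of building the onnxruntime-openvino wheel from source, with the device argument and any extra flags as assumptions, is:

```bash
# Rough sketch: build the OpenVINO EP wheel from an onnxruntime checkout.
# The --use_openvino device argument (e.g. CPU_FP32 / GPU_FP32, or plain CPU / GPU
# on newer branches) and any workaround flags for intel/neural-speed#188 are assumptions.
./build.sh --config Release --build_wheel --use_openvino CPU_FP32 --parallel --skip_tests
```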
To clarify, the issue is occurring with 2023.3.0. The pattern I'm seeing is that it works on CPUs with Iris Xe graphics, but not on CPUs with UHD graphics.
Is it possible to downgrade to onnxruntime-openvino 1.15.0 with OpenVINO 2023.1.0? If yes, how do I do that with Docker Compose? (Sorry, but I'm not an expert.) Thanks
I can confirm that this is an existing issue that breaks OpenVINO on Intel Arc (770) under Windows. @shummo A downgrade to
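A concrete downgrade command was not given in this thread; a minimal sketch, assuming the container installs its Python packages with pip, would be:

```bash
# Hypothetical pin to the last known-good combination mentioned in this issue.
pip install "onnxruntime-openvino==1.15.0" "openvino==2023.1.0"
```

With Docker Compose this would typically go into the Dockerfile used to build the service image rather than the compose file itself.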
Hi @mertalev, I have tested your model and it's running inference successfully on both Windows 11 and Ubuntu 22.04 for CPU and GPU. I would recommend you and the community to use the latest OpenVINO Toolkit v2024.1 and OpenVINO EP v1.18.0, which will be available soon in the upcoming ONNX Runtime release. You can also build OpenVINO EP from source for the same.
Thanks for the testing and update! We'll upgrade to 1.18.0 and 2024.1.0 once the former is available. When you mention that it works on GPU, can you clarify whether you tested with an iGPU like UHD Graphics or a dGPU like Arc (I believe Iris Xe also counts as a dGPU)? iGPUs struggle, but I haven't seen anyone with a dGPU have an issue with this model.
It's tested on a Meteor Lake processor (Intel Core Ultra 7 1003H), whose iGPU is Intel Arc Graphics. I'd recommend you try your application on multiple platforms. As suggested above, you can also build OpenVINO EP from the main branch of this repository to get the latest wheels for your work environment and reproduce the same.
Thanks for the update!
Hi, could you please add Intel UHD Graphics to your list of test cases/testing setups? It is currently broken on UHD (but not Xe graphics), so even if it works on your testing setup, this may still be broken.
@ankitm3k No, onnxruntime-openvino does not work with the latest OpenVINO 2024.1.
I have tested your model with our C++ onnxruntime_perf_test app built from source, and it runs inference successfully with the machine configurations below. I'd recommend you use either Intel's repo, master or rel-1.18.0 (https://github.com/intel/onnxruntime.git), or Microsoft's master branch (https://github.com/microsoft/onnxruntime.git) or rel-1.18.0 (https://github.com/microsoft/onnxruntime/commits/rel-1.18.0/) to build the wheels using the command below. Discover the wheel in the example path below and install it using pip.
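The wheel path from the original comment was not preserved; on a typical Linux build the wheel ends up under the build output's dist directory (the exact path below is an assumption):

```bash
# Install the freshly built OpenVINO EP wheel; adjust the path for your platform and config.
pip install build/Linux/Release/dist/onnxruntime_openvino-*.whl
```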
Doesn't onnxruntime default to the dGPU if one is present? So it would run on the A380 (working) instead of the UHD 770 iGPU (not working). I can try substituting 1.18 to see if that fixes anything regardless.
The onnxruntime default device_type for GPU is the iGPU (GPU.0); if you want to explicitly use the dGPU (GPU.1), then set device_type to GPU.1 in your inference provider options. Please build your onnxruntime-openvino wheels from the main branch and install them in your Python virtual env so that you get the latest release changes. This should solve your issues.
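For illustration, selecting the dGPU explicitly might look like the sketch below ("model.onnx" and the GPU.1 index are assumptions that depend on your setup):

```python
import onnxruntime as ort

# GPU.0 is the default and usually maps to the iGPU; GPU.1 selects the dGPU.
providers = [
    ("OpenVINOExecutionProvider", {"device_type": "GPU.1"}),
    "CPUExecutionProvider",
]
session = ort.InferenceSession("model.onnx", providers=providers)
```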
This particular issue has been fixed as far as I can tell.
Describe the issue
When using OpenVINO, the session can be created, but calling run leads to the error: RuntimeException: [ONNXRuntimeError] RUNTIME_EXCEPTION: Encountered unknown exception. Based on reports in this issue, there seems to be a pattern with the N100 CPU in particular.
This seems to be a regression, as this error only appears after upgrading to 1.17.1 of onnxruntime-openvino with OpenVINO 2023.3.0. This model worked when using 1.15.0 and OpenVINO 2023.1.0.
After enabling the following environment variables:
There are a few additional logs, but none that seem pertinent:
To reproduce
With onnxruntime-openvino 1.17.1 and OpenVINO 2023.3.0, create a session including the following providers:
And the following provider options:
Then attempt to run inference with this model. It may or may not work depending on the CPU.
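The exact provider list and options from the original report are not preserved in this copy; a rough reconstruction of the reproduction, with the model path, device_type, and dummy input as assumptions, is:

```python
import numpy as np
import onnxruntime as ort

# Assumed session setup: OpenVINO EP with CPU fallback, as described above.
providers = [
    ("OpenVINOExecutionProvider", {"device_type": "CPU_FP32"}),
    "CPUExecutionProvider",
]
session = ort.InferenceSession("model.onnx", providers=providers)

# Feed a dummy input of the declared shape; on affected CPUs this run() call
# raises RUNTIME_EXCEPTION: Encountered unknown exception.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
outputs = session.run(None, {inp.name: np.zeros(shape, dtype=np.float32)})
```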
You may use this image to have the exact software environment producing the issue:
ghcr.io/immich-app/immich-machine-learning@sha256:01799596c7f40495887d4027df1c0f4c144c7cd6ab34937ef2cc14d246470095
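For example, to pull the pinned image and open a shell in it (the bash entrypoint override is an assumption about the image contents):

```bash
docker pull ghcr.io/immich-app/immich-machine-learning@sha256:01799596c7f40495887d4027df1c0f4c144c7cd6ab34937ef2cc14d246470095
docker run --rm -it --entrypoint /bin/bash \
  ghcr.io/immich-app/immich-machine-learning@sha256:01799596c7f40495887d4027df1c0f4c144c7cd6ab34937ef2cc14d246470095
```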
Urgency
No response
Platform
Linux
OS Version
Ubuntu 22.04
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.17.1
ONNX Runtime API
Python
Architecture
X64
Execution Provider
OpenVINO
Execution Provider Library Version
2023.3.0