[Bug]: OpenVino tries to run on NPU instead of CPU and fails (Windows) #28746
I also verified the problem by changing/removing the openvino_intel_npu_plugin DLL. When I remove it and try to run, it says a device named "NPU" is not registered in the OpenVINO runtime. I then tried a hack to see whether that would fix it: I copied the CPU plugin DLL and renamed it as the NPU plugin, and it was accepted and reported as being used for NPU, even though it contains the CPU implementation. So please help; I definitely think something is bugged here.
I would like to work on this; please assign it to me.
I had to downgrade to ONNX Runtime 1.20.0 and OpenVINO 2024.5 and download the DLLs from the Intel release. The DLLs I built myself on two different platforms simply don't work.
.take
Thank you for looking into this issue! Please let us know if you have any questions or require any help.
Manually specify the plugin path:

```cpp
session_options.AppendExecutionProvider("OpenVINO", {
    {"device_type", "CPU"},
    {"precision", "FP32"},
    {"device_id", "0"},
    {"config_path", "C:\\Program Files (x86)\\Intel\\openvino_2024\\runtime\\bin"}
});
```

If this doesn't work, explicitly set the OpenVINO plugin path in your environment:

```powershell
$env:OPENVINO_PLUGIN_PATH="C:\Program Files (x86)\Intel\openvino_2024\runtime\bin"
```

Then restart your application to apply the changes.
OpenVINO Version
2024.6.0.0
Operating System
Windows
Device used for inference
CPU
Framework
ONNX
Model used
No response
Issue description
I have a problem that is driving me crazy.
I have a C++ app that runs inference with ONNX Runtime built with the OpenVINO execution provider (ONNX Runtime 1.20.1, OpenVINO 2024.6.0.0).
The exact same code, with the same versions and built the same way, works flawlessly on Ubuntu 20.04 LTS.
On Windows I set everything up: the OpenVINO execution provider shows up when I print ONNX Runtime's available providers, and the CPU fallback of this ONNX Runtime build can execute code. I wrote a very simple inference test to verify.
Appending the execution provider causes no problem.
That step doesn't throw any error whatsoever (bear in mind that if the DLLs etc. were problematic, this step would normally throw an exception).
The problem appears as soon as anything related to inferencing happens.
For example, when I then try to create the session and load the model:
```cpp
face_Detector = std::make_unique<Ort::Session>(env, szName, session_options);
```
This part fails with the following error:
```
2025-01-30 11:10:09.1171902 [E:onnxruntime:, inference_session.cc:2118 onnxruntime::InferenceSession::Initialize::<lambda_bb0f2733ab1b4cfce20bb4c96cca91f0>::operator ()] Exception during initialization: Exception from src\inference\src\cpp\core.cpp:265: Exception from src\inference\src\dev\plugin.cpp:97: Exception from src\plugins\intel_npu\src\plugin\src\backends.cpp:222: No available backend
```
I don't get it: I supply CPU with FP32. When I change CPU to, say, NPU, it fails with a different error saying NPU is not available (I don't have an NPU); GPU behaves the same way, failing correctly when it is supposed to. But CPU is available and shouldn't fail.
I dual-boot this machine, so the same code on the same processor works without any problem on Ubuntu. But since trying to port the same code to Windows, I've hit endless issues.
I also got an Azure machine with an Intel CPU to rule out my hardware, built everything from scratch, and still got the same error.
Please help; I'm going crazy trying to figure out what's wrong.
Step-by-step reproduction
No response
Relevant log output
Issue submission checklist