
[Bug]: OpenVino tries to run on NPU instead of CPU and fails (Windows) #28746

Open · 3 tasks done
ugurkan-syntonym opened this issue Jan 30, 2025 · 6 comments
Labels: bug (Something isn't working), support_request

@ugurkan-syntonym

OpenVINO Version

2024.6.0.0

Operating System

Windows System

Device used for inference

CPU

Framework

ONNX

Model used

No response

Issue description

This problem is driving me crazy.
I have a C++ app that runs inference with ONNX Runtime using the OpenVINO execution provider (ONNX Runtime 1.20.1, OpenVINO 2024.6.0.0).
The exact same code, with the same versions and built the same way, works flawlessly on Ubuntu 20.04 LTS.
On Windows I set everything up: the OpenVINO execution provider shows up when I print ONNX Runtime's available providers, and the CPU fallback of this ONNX Runtime build can run inference. I wrote a very simple inference test to check.
Appending the execution provider works without any problem:

options["device_type"] = "CPU";
 options["precision"] = "FP32";
 std::cout << "OpenVINO device type is set to: " << options["device_type"] + options["precision"] << std::endl;
 auto what = Ort::GetAvailableProviders();
 for (auto &i : what)
 {
     std::cout << i << std::endl;
 }
try
 {session_options.AppendExecutionProvider("OpenVINO",options);}
 catch (const std::exception& e)
 {
     std::cerr << e.what() << '\n';
 }

This part doesn't produce any error whatsoever; the execution provider is appended successfully (bear in mind that if the DLLs etc. were missing, this step would normally throw an exception).

The problem appears as soon as anything touches inference.

For example, right after this, when I try to create the session and load the model:

```cpp
face_Detector = std::make_unique<Ort::Session>(env, szName, session_options);
```

This call fails.

The error is:

```
2025-01-30 11:10:09.1171902 [E:onnxruntime:, inference_session.cc:2118 onnxruntime::InferenceSession::Initialize::<lambda_bb0f2733ab1b4cfce20bb4c96cca91f0>::operator ()] Exception during initialization: Exception from src\inference\src\cpp\core.cpp:265: Exception from src\inference\src\dev\plugin.cpp:97: Exception from src\plugins\intel_npu\src\plugin\src\backends.cpp:222: No available backend
```

I don't get it: I supply CPU and FP32. When I change CPU to NPU, for example, it fails with a different error saying the NPU is not available (I don't have an NPU); same for GPU. Those cases fail correctly when they are supposed to fail, but CPU is available and shouldn't fail at all.
I dual-boot this machine, so the same code on the same processor works on Ubuntu without any problem, but since I started porting it to Windows I have hit a million issues.

I also got an Azure machine with an Intel CPU to check whether my machine was the problem, built everything from scratch, and still got the same error.
Please help me, I'm going crazy trying to figure out what's wrong.
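
As a sanity check (not part of the original report), querying OpenVINO directly, outside ONNX Runtime, would show which device plugins actually resolve on this Windows install. A minimal sketch, assuming the OpenVINO 2.x C++ API and that openvino.dll plus the plugin DLLs are on PATH:

```cpp
// Diagnostic sketch: list the OpenVINO devices whose plugins can be loaded,
// independent of ONNX Runtime. If "CPU" is missing here, the problem is in
// the OpenVINO install / DLL resolution rather than in the EP options.
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    try {
        ov::Core core;
        for (const auto& device : core.get_available_devices()) {
            std::cout << "Available OpenVINO device: " << device << std::endl;
        }
    } catch (const std::exception& e) {
        std::cerr << "OpenVINO device query failed: " << e.what() << std::endl;
    }
    return 0;
}
```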

Step-by-step reproduction

No response

Relevant log output

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
ugurkan-syntonym added the bug (Something isn't working) and support_request labels on Jan 30, 2025
@ugurkan-syntonym (Author)

I also verified the problem by removing/replacing the openvino_intel_npu_plugin DLL: when I remove it and run, it says the device with name "NPU" is not registered in the OpenVINO runtime.

I also tried a hack to see whether it would fix things: I copied the CPU plugin DLL and renamed it to the NPU plugin name, and it actually accepts it and reports it is being used for NPU, even though it contains the CPU implementation.

So please help; I definitely think something is bugged here.
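
For context, and as an assumption on my part rather than something stated in the thread, OpenVINO resolves device names to plugin libraries by name (via its built-in defaults or plugins.xml), which would explain why a renamed copy of the CPU plugin gets picked up for "NPU". The mapping can also be set explicitly; a sketch using ov::Core::register_plugin, with the plugin library name as commonly shipped in OpenVINO 2024 Windows packages and a hypothetical alias "CPU_COPY":

```cpp
// Sketch (assumption, not from the thread): register the CPU plugin library
// under a custom device name, mirroring the renamed-DLL experiment, then
// force the plugin to load by querying its version info.
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    try {
        // Map the hypothetical device name "CPU_COPY" to the CPU plugin library.
        core.register_plugin("openvino_intel_cpu_plugin", "CPU_COPY");
        for (const auto& [name, version] : core.get_versions("CPU_COPY")) {
            std::cout << name << ": " << version.description << std::endl;
        }
    } catch (const std::exception& e) {
        std::cerr << e.what() << std::endl;
    }
    return 0;
}
```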

@abhishekdubey369

I would like to work on this, please assign it to me.

@ugurkan-syntonym (Author)

I had to downgrade to ONNX Runtime 1.20.0 and OpenVINO 2024.5 and use the DLLs downloaded from the Intel release. The DLLs I built myself on two different platforms basically don't work.

@Bheema-Shanker-Neyigapula

.take

github-actions bot commented Feb 5, 2025

Thank you for looking into this issue! Please let us know if you have any questions or require any help.

@Bheema-Shanker-Neyigapula commented Feb 5, 2025

Hi @ugurkan-syntonym

Manually Specify the Plugin Path
Windows may not be resolving OpenVINO’s cpu_plugin.dll properly.
Try setting the plugin path manually in session options.

```cpp
session_options.AppendExecutionProvider("OpenVINO", {
    {"device_type", "CPU"},
    {"precision", "FP32"},
    {"device_id", "0"},
    {"config_path", "C:\\Program Files (x86)\\Intel\\openvino_2024\\runtime\\bin"}
});
```

If this doesn't work, explicitly set the OpenVINO plugin path in your environment:

```powershell
$env:OPENVINO_PLUGIN_PATH="C:\Program Files (x86)\Intel\openvino_2024\runtime\bin"
```

Then restart your application to apply the changes.
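
If the plugin still cannot be found, a quick check (my addition, not part of the thread) is whether the application's process can resolve the CPU plugin DLL at all through the normal Windows DLL search order. A minimal sketch, assuming the default OpenVINO 2024 plugin file name openvino_intel_cpu_plugin.dll and that the runtime bin directory is on PATH:

```cpp
// Diagnostic sketch (assumption, not from the thread): try to load the
// OpenVINO CPU plugin DLL directly. Failure here points at PATH / missing
// dependency problems rather than at ONNX Runtime or the EP options.
#include <windows.h>
#include <iostream>

int main() {
    const wchar_t* pluginDll = L"openvino_intel_cpu_plugin.dll"; // default name in OpenVINO 2024 Windows packages
    HMODULE handle = LoadLibraryW(pluginDll);
    if (handle != nullptr) {
        std::wcout << pluginDll << L" loaded successfully" << std::endl;
        FreeLibrary(handle);
    } else {
        // Win32 error 126 usually means the DLL or one of its dependencies was not found.
        std::wcout << L"Failed to load " << pluginDll
                   << L" (Win32 error " << GetLastError() << L")" << std::endl;
    }
    return 0;
}
```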
