
[Bug]: OpenVino tries to run on NPU instead of CPU and fails (Windows) #28746

Open
@ugurkan-syntonym

Description

OpenVINO Version

2024.6.0.0

Operating System

Windows System

Device used for inference

CPU

Framework

ONNX

Model used

No response

Issue description

This problem is driving me crazy.
I have a C++ app that runs inference through ONNX Runtime with the OpenVINO execution provider enabled (ONNX Runtime 1.20.1, OpenVINO 2024.6.0.0).
The exact same code, with the same versions and built the same way, works flawlessly on Ubuntu 20.04 LTS.
On Windows I set everything up: OpenVINO shows up when I print ONNX Runtime's available providers, and the CPU fallback of this ONNX Runtime build executes fine. I also wrote a very simple inference test program to check.
Appending the execution provider works without any problem:

options["device_type"] = "CPU";
options["precision"] = "FP32";
std::cout << "OpenVINO device type is set to: " << options["device_type"] + options["precision"] << std::endl;

auto providers = Ort::GetAvailableProviders();
for (auto &provider : providers)
{
    std::cout << provider << std::endl;
}

try
{
    session_options.AppendExecutionProvider("OpenVINO", options);
}
catch (const std::exception& e)
{
    std::cerr << e.what() << '\n';
}

This part runs without any error whatsoever and appends the execution provider successfully (bear in mind that if the DLLs etc. were problematic, this step would normally throw an exception).

The problem appears as soon as anything related to inference is attempted.

For example after this part when I try to create session and model

`face_Detector = std::make_unique<Ort::Session>(env, szName, session_options);`

This part fails.

The error is:
2025-01-30 11:10:09.1171902 [E:onnxruntime:, inference_session.cc:2118 onnxruntime::InferenceSession::Initialize::<lambda_bb0f2733ab1b4cfce20bb4c96cca91f0>::operator ()] Exception during initialization: Exception from src\inference\src\cpp\core.cpp:265: Exception from src\inference\src\dev\plugin.cpp:97: Exception from src\plugins\intel_npu\src\plugin\src\backends.cpp:222: No available backend

I don't get it: I supply CPU and FP32. When I change CPU to, say, NPU, it fails with a different error saying the NPU is not available (I don't have an NPU); same with GPU. Those fail correctly when they are supposed to fail. But the CPU is available and shouldn't fail.
My machine dual-boots, so the same code on the same processor runs without any problem under Ubuntu. But ever since I started porting it to Windows, I've hit millions of issues.

I also got an Azure machine with an Intel CPU to rule out a problem with my own hardware, built everything from scratch there, and still got the same error.
Please help me, I'm going crazy trying to figure out what's wrong.
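As a diagnostic, it may help to ask OpenVINO directly which devices it can enumerate on the Windows machine, independent of ONNX Runtime. A minimal sketch using OpenVINO's C++ API (assuming the OpenVINO 2024.6 headers and runtime DLLs are on the path; `main.cpp` and the build setup are illustrative, not from the original report):

```cpp
// Minimal OpenVINO device probe: prints the devices the OpenVINO runtime can
// actually see on this machine. If "CPU" is missing from this list, the
// problem lies in the OpenVINO installation itself (plugins/DLLs) rather
// than in ONNX Runtime's execution provider.
#include <iostream>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;
    for (const auto& device : core.get_available_devices()) {
        std::cout << device << std::endl;
    }
    return 0;
}
```

If this prints `CPU` but ONNX Runtime still ends up in the NPU backend (`src\plugins\intel_npu\...`), that would point at device selection inside the OpenVINO execution provider rather than at the OpenVINO plugins themselves.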

Step-by-step reproduction

No response

Relevant log output

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
