
[E:onnxruntime:, qnn_execution_provider.cc:591 GetCapability] QNN SetupBackend failed qnn_backend_manager.cc:334 InitializeBackend Failed to initialize backend #21157

Open
June1124 opened this issue Jun 25, 2024 · 4 comments
Labels
ep:QNN - issues related to QNN execution provider
platform:mobile - issues related to ONNX Runtime mobile; typically submitted using template
stale - issues that have not been addressed in a while; categorized by a bot

Comments

June1124 commented Jun 25, 2024

Describe the issue

Running yolov8-pose inference with onnxruntime, using the QNN execution provider (GPU backend) on a Qualcomm 8155.

To reproduce

// Wide-string copy of the model path (only needed for the Windows API overload; unused on Android).
std::wstring widestr = std::wstring(model_path.begin(), model_path.end());

// Configure the QNN execution provider.
std::unordered_map<std::string, std::string> qnn_options;
qnn_options["backend_path"] = "/data/local/tmp/libQnnGpu.so"; // or libQnnHtp.so for the HTP backend
// qnn_options["profiling_level"] = "basic"; // QNN profiling level: 'basic', 'detailed'; default 'off'
// qnn_options["htp_performance_mode"] = "sustained_high_performance"; // HTP backend only; options: 'burst', 'balanced', 'default', 'high_performance', 'high_power_saver', 'low_balanced', 'extreme_power_saver', 'low_power_saver', 'power_saver', 'sustained_high_performance'
sessionOptions.AppendExecutionProvider("QNN", qnn_options);

ort_session = new Ort::Session(env, model_path.c_str(), sessionOptions);
size_t numInputNodes = ort_session->GetInputCount();
size_t numOutputNodes = ort_session->GetOutputCount();
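Since QNN backend initialization ultimately dlopen()s the library at backend_path (the retitled error shows a raw dlopen "Permission denied"), a pre-flight check can separate a file the process cannot map from a genuine onnxruntime failure. A minimal sketch, assuming a POSIX environment; the path is the one from the repro and is an assumption about the target device:

```cpp
#include <dlfcn.h>
#include <cstdio>

// Pre-flight check: try to load a QNN backend library the same way the
// QNN execution provider will, and surface the dlerror() reason on failure.
bool CanLoadBackend(const char* path) {
    void* handle = dlopen(path, RTLD_NOW | RTLD_LOCAL);
    if (handle == nullptr) {
        std::fprintf(stderr, "dlopen(%s) failed: %s\n", path, dlerror());
        return false;
    }
    dlclose(handle);
    return true;
}
```

Calling CanLoadBackend("/data/local/tmp/libQnnGpu.so") before AppendExecutionProvider makes the failure mode explicit: on newer Android versions, SELinux policy commonly blocks an app process from mapping executable code out of /data/local/tmp, in which case the library should instead ship inside the APK's native library directory.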

Urgency

No response

Platform

Android

OS Version

9.0

ONNX Runtime Installation

Built from Source

Compiler Version (if 'Built from Source')

31

Package Name (if 'Released Package')

None

ONNX Runtime Version or Commit ID

1.17.0

ONNX Runtime API

C++/C

Architecture

ARM64

Execution Provider

SNPE, Other / Unknown

Execution Provider Library Version

QNN

June1124 added the platform:mobile label on Jun 25, 2024
June1124 changed the title on Jun 25, 2024.
Old title: [E:onnxruntime:, qnn_execution_provider.cc:591 GetCapability] QNN SetupBackend failed qnn_backend_manager.cc:56 GetQnnInterfaceProvider Unable to load backend, error: dlopen failed: couldn't map "/data/local/tmp/libQnnGpu.so" segment 2: Permission denied
New title: [E:onnxruntime:, qnn_execution_provider.cc:591 GetCapability] QNN SetupBackend failed qnn_backend_manager.cc:334 InitializeBackend Failed to initialize backend
jywu-msft (Member) commented:

Have you tried OnnxRuntime 1.18.0?
And which QNN SDK version are you using?

jywu-msft added the ep:QNN label on Jun 25, 2024
June1124 (Author) commented:

> Have you tried OnnxRuntime 1.18.0? And which QNN SDK version are you using?

QNN SDK: 2.18.0.240101
I checked the onnxruntime release notes and confirmed that ONNX Runtime 1.17.0 supports QNN SDK 2.18.
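Mismatches between the QNN SDK version on the device and the version an ONNX Runtime build supports are a common cause of InitializeBackend failures, so it is worth checking the dotted version strings programmatically. A hypothetical helper for that comparison (the version values in the usage note are the ones from this thread, not a full compatibility matrix):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Parse a dotted version string such as "2.18.0.240101" into numeric parts.
std::vector<int> ParseVersion(const std::string& v) {
    std::vector<int> parts;
    std::istringstream in(v);
    std::string tok;
    while (std::getline(in, tok, '.')) parts.push_back(std::stoi(tok));
    return parts;
}

// True when `have` is at least version `need`, comparing only the
// components that `need` specifies (so "2.18" matches "2.18.0.240101").
bool AtLeast(const std::string& have, const std::string& need) {
    std::vector<int> a = ParseVersion(have), b = ParseVersion(need);
    for (size_t i = 0; i < b.size(); ++i) {
        int ai = i < a.size() ? a[i] : 0;
        if (ai != b[i]) return ai > b[i];
    }
    return true;
}
```

For example, AtLeast("2.18.0.240101", "2.18") holds for the SDK reported above, while an older device image such as "2.17.x" would fail the check.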

June1124 (Author) commented:

> Have you tried OnnxRuntime 1.18.0? And which QNN SDK version are you using?

Do I need to test with onnxruntime 1.18.0?

github-actions bot (Contributor) commented:

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

github-actions bot added the stale label on Jul 25, 2024