Required LLM feature not found #3

Open
abaker opened this issue Oct 1, 2024 · 28 comments

Comments

@abaker

abaker commented Oct 1, 2024

Trying to run the sample on two devices (a Pixel 8 on Android 15 with "Enable on-device GenAI Features" enabled, and a Pixel 9 Pro on GrapheneOS with Play Services and AICore installed), I get the following error:

2024-10-01 12:18:07.410 15763-15763 EntryChoiceActivity  com.google.ai.edge.aicore.demo  E  Failed to check model availability.
com.google.ai.edge.aicore.UnknownException: AICore failed with error type 2-INFERENCE_ERROR and error code 8-NOT_AVAILABLE: Required LLM feature not found
	at com.google.ai.edge.aicore.GenerativeAIException$Companion.from$java_com_google_android_apps_aicore_client_client(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:7)
	at com.google.ai.edge.aicore.GenerativeModel$prepareInferenceEngine$2.invokeSuspend(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:9)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
	at java.lang.Thread.run(Thread.java:1012)
Caused by: com.google.ai.edge.aicore.InferenceException: AICore failed with error type 2-INFERENCE_ERROR and error code 8-NOT_AVAILABLE: Required LLM feature not found
	at com.google.ai.edge.aicore.GenerativeAIException$Companion.from$java_com_google_android_apps_aicore_client_client(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:4)
	at com.google.ai.edge.aicore.GenerativeModel.getAiFeature(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:9)
	at com.google.ai.edge.aicore.GenerativeModel.access$getAiFeature(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:1)
	at com.google.ai.edge.aicore.GenerativeModel$getAiFeature$1.invokeSuspend(Unknown Source:14)
	... 5 more

Clicking "Run the demo" buttons results in a toast that "Model is unavailable yet and downloading in background"

@wilsoncastiblanco

wilsoncastiblanco commented Oct 1, 2024

I got the same error. I think there may be extra steps needed to install native libraries on the device, since I saw this in the logcat:
Unable to open libpenguin.so: dlopen failed: library "libpenguin.so" not found.

I'm running the demo on a Samsung Galaxy S24.

Note: I'm not sure whether this could cause the error (I don't think so), but the instructions mention verifying the release version:

Check out the AICore APK version in the Play store (under "about this app" tab) to confirm it starts with 0.thirdpartyeap

but the beta version downloaded was 0.release.qc8650.prod_aicore_20240822.00_RC05.676804926

@MillionthOdin16

MillionthOdin16 commented Oct 2, 2024

I have the same error messages and issue as the OP. I'm running the most recent Android 15 beta on a Pixel 9 Pro XL. I followed the specific instructions added to the README as well, but no luck.

aicore version 0.thirdpartyeap.prod_aicore_20240822.00_RC05.676804926
com.google.android.aicore

private compute version
1.0.release.667904832
com.google.android.as.oss

@tzhang997
Contributor

Please ensure that you are only running the demo on a Pixel 9 series device.

If you are seeing the "Required LLM feature not found" error, please ensure that you have followed the instructions to set up your testing environment (especially the Prerequisites & Update APKs sections).

@MillionthOdin16

MillionthOdin16 commented Oct 2, 2024

If you are seeing "Required LLM feature not found" error, please ensure that you have followed the instructions to set up your testing environment (Prerequisites & Update APKs sections especially).

I followed all the steps on my Pixel 9 and made sure that the apps are the correct version, but it's unable to download the model. I don't get any indication that there is any download progress at all.

Additionally, "0.thirdpartyeap.prod_aicore_20240822.00_RC05.676804926" appears to break all AI Core functionality for existing apps such as screenshots, phone call notes, and recorder summarization. Using the beta version of AI Core doesn't even allow the downloading of models for those tools. I had to unenroll in the beta and roll back to the normal release in order to successfully redownload the models after clearing the app data.

I'm wondering if there's an issue with downloading models on the EAP version?
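
(Side note for anyone else poking at the download path: here's a rough sketch of how download progress could be surfaced from the client, assuming the experimental SDK's `DownloadConfig`/`DownloadCallback` hooks behave as the setup guide suggests. The callback method names, the positional `GenerativeModel(generationConfig, downloadConfig)` constructor, and `appContext` are assumptions for illustration, not verified API.)

```kotlin
import android.content.Context
import android.util.Log
import com.google.ai.edge.aicore.DownloadCallback
import com.google.ai.edge.aicore.DownloadConfig
import com.google.ai.edge.aicore.GenerativeAIException
import com.google.ai.edge.aicore.GenerativeModel
import com.google.ai.edge.aicore.generationConfig

// Sketch only: attach a download callback so that download progress (or the
// lack of it) shows up in logcat instead of failing silently.
fun buildModelWithDownloadLogging(appContext: Context): GenerativeModel {
    val downloadCallback = object : DownloadCallback {
        override fun onDownloadStarted(bytesToDownload: Long) {
            Log.d("AICoreDemo", "Download started, expecting $bytesToDownload bytes")
        }
        override fun onDownloadProgress(totalBytesDownloaded: Long) {
            Log.d("AICoreDemo", "Downloaded $totalBytesDownloaded bytes so far")
        }
        override fun onDownloadCompleted() {
            Log.d("AICoreDemo", "Model download completed")
        }
        override fun onDownloadFailed(failureStatus: String, e: GenerativeAIException) {
            Log.e("AICoreDemo", "Model download failed: $failureStatus", e)
        }
    }
    return GenerativeModel(
        generationConfig {
            context = appContext   // application context, required by the SDK
            temperature = 0.2f     // placeholder sampling values
            topK = 16
            maxOutputTokens = 256
        },
        DownloadConfig(downloadCallback)
    )
}
```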

@SilverDestiny

SilverDestiny commented Oct 2, 2024

If you are seeing "Required LLM feature not found" error, please ensure that you have followed the instructions to set up your testing environment (Prerequisites & Update APKs sections especially).

I followed all the steps on my Pixel 9 and made sure that the apps are the correct version, but it's unable to download the model. I don't get any indication that there is any download progress at all.

Additionally, "0.thirdpartyeap.prod_aicore_20240822.00_RC05.676804926" appears to break all AI Core functionality for existing apps such as screenshots, phone call notes, and recorder summarization. Using the beta version of AI Core doesn't even allow the downloading of models for those tools. I had to unenroll in the beta and roll back to the normal release in order to successfully redownload the models after clearing the app data.

I'm wondering if there's an issue with downloading models on the EAP version?

For the Pixel 9, try uninstalling the sample app and then installing it again to see whether that fixes the download issue (maybe reboot as well).

For the other functionality not working in the EAP version, we'll fix it in the next AICore release. Thanks for reporting the issue!

@TommyPeace

S24 Ultra

Failed to check model availability.
com.google.ai.edge.aicore.ConnectionException: AICore failed with error type 4-CONNECTION_ERROR and error code 601-BINDING_FAILURE: AiCore service failed to bind.
  	at com.google.ai.edge.aicore.GenerativeAIException$Companion.from$java_com_google_android_apps_aicore_client_client(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:2)
  	at com.google.ai.edge.aicore.GenerativeModel$prepareInferenceEngine$2.invokeSuspend(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:9)
  	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
  	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
  	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
  	at java.lang.Thread.run(Thread.java:1012)
  Caused by: java.lang.SecurityException: Not allowed to bind to service Intent { cmp=com.google.android.aicore/com.google.android.apps.aicore.service.multiuser.AiCoreMultiUserService }
  	at android.app.ContextImpl.bindServiceCommon(ContextImpl.java:2203)
  	at android.app.ContextImpl.bindService(ContextImpl.java:2060)
  	at android.content.ContextWrapper.bindService(ContextWrapper.java:878)
  	at com.google.android.gms.internal.aicore.zzcf.zzf(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:1)
  	at com.google.android.gms.internal.aicore.zzcf.zzb(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:3)
  	at com.google.android.gms.internal.aicore.zzcg.zzp(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:3)
  	at com.google.android.gms.internal.aicore.zzcg.zzh(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:1)
  	at com.google.android.gms.internal.aicore.zzcg.zza(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:1)
  	at com.google.ai.edge.aicore.GenerativeModel.getAiFeature(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:2)
  	at com.google.ai.edge.aicore.GenerativeModel.createLlmService(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:2)
  	at com.google.ai.edge.aicore.GenerativeModel.access$createLlmService(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:1)
  	at com.google.ai.edge.aicore.GenerativeModel$prepareInferenceEngine$2.invokeSuspend(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:5)
  	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33) 
  	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108) 
  	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
  	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644) 
  	at java.lang.Thread.run(Thread.java:1012) 

@MillionthOdin16

> For the Pixel 9, try uninstalling the sample app and then installing it again to see whether that fixes the download issue (maybe reboot as well).

Yes, I've tried installing, rebooting, uninstalling and reinstalling, clearing app data for the app and AICore, and everything else I can think of.

I can't get it to do anything other than give me an error message. The EAP AICore app data never exceeds 26 MB either.

@MillionthOdin16

> S24 Ultra

They've stated it only supports the Pixel 9 at the moment, so I don't think it will work.

@kenkawakenkenke

kenkawakenkenke commented Oct 6, 2024

Also confirming I see the same issue and stack trace as OP on a Pixel 9:

  • I've followed all the steps on (https://developer.android.com/ai/gemini-nano/experimental), except for the line "that you are logged in with only the account that you intend to use for testing"; I have several accounts on my device. Perhaps this makes a difference?
  • Android AI Core: 0.thirdpartyeap.prod_aicore_20240822.00_RC05.676804926
  • Private Compute Service APK: 1.0.release.658389993

Same thing for the minimal example in the "Get started" page (https://developer.android.com/ai/gemini-nano/experimental#run-inference), I get the "Required LLM feature not found" error.

@schitzoidimbolism

> Also confirming I see the same issue and stack trace as OP on a Pixel 9:
>
>   • I've followed all the steps on (https://developer.android.com/ai/gemini-nano/experimental), except for the line "that you are logged in with only the account that you intend to use for testing"; I have several accounts on my device. Perhaps this makes a difference?
>   • Android AI Core: 0.thirdpartyeap.prod_aicore_20240822.00_RC05.676804926
>   • Private Compute Service APK: 1.0.release.658389993
>
> Same thing for the minimal example in the "Get started" page (https://developer.android.com/ai/gemini-nano/experimental#run-inference), I get the "Required LLM feature not found" error.

Can you please update this thread when you get a solution? I see the same stack trace after following all the steps except deleting all accounts other than the enrolled one, but I assumed I was stuck because I'm on a Pixel 8. The AICore beta and PCS are the correct versions and enrollment is complete.

@HuixingWong

I also encountered this problem.

@sam43

sam43 commented Oct 8, 2024

Encountered the same issue on a Google Pixel 8 Pro; the stack trace follows:

> I've followed all the steps on (https://developer.android.com/ai/gemini-nano/experimental), except for the line "that you are logged in with only the account that you intend to use for testing"; I have several accounts on my device. Perhaps this makes a difference?
> Android AI Core: 0.thirdpartyeap.prod_aicore_20240822.00_RC05.676804926
> Private Compute Service APK: 1.0.release.658389993
> Same thing for the minimal example in the "Get started" page (https://developer.android.com/ai/gemini-nano/experimental#run-inference), I get the "Required LLM feature not found" error.

Stack trace:

com.google.ai.edge.aicore.demo  E  Failed to check model availability.
com.google.ai.edge.aicore.UnknownException: AICore failed with error type 2-INFERENCE_ERROR and error code 8-NOT_AVAILABLE: Required LLM feature not found
	at com.google.ai.edge.aicore.GenerativeAIException$Companion.from$java_com_google_android_apps_aicore_client_client(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:7)
	at com.google.ai.edge.aicore.GenerativeModel$prepareInferenceEngine$2.invokeSuspend(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:9)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
	at java.lang.Thread.run(Thread.java:1012)
Caused by: com.google.ai.edge.aicore.InferenceException: AICore failed with error type 2-INFERENCE_ERROR and error code 8-NOT_AVAILABLE: Required LLM feature not found
	at com.google.ai.edge.aicore.GenerativeAIException$Companion.from$java_com_google_android_apps_aicore_client_client(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:4)
	at com.google.ai.edge.aicore.GenerativeModel.getAiFeature(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:9)
	at com.google.ai.edge.aicore.GenerativeModel.access$getAiFeature(com.google.ai.edge.aicore:aicore@@0.0.1-exp01:1)
	at com.google.ai.edge.aicore.GenerativeModel$getAiFeature$1.invokeSuspend(Unknown Source:14)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
	at java.lang.Thread.run(Thread.java:1012)

@bcdj

bcdj commented Oct 9, 2024

For people who are encountering this issue on Pixel 9 devices, are your devices rooted? AICore is not available on rooted devices.

@MillionthOdin16

> For people who are encountering this issue on Pixel 9 devices, are your devices rooted? AICore is not available on rooted devices.

I'm not rooted; I'm running the most recent Android 15 beta.

@enduringstack

@bcdj If the device is rooted, is there any way to use AICore?

@mnasrallah301

Are there any updates? I get the same "Required LLM feature not found" error despite following all the instructions. It's a shame that I can't test it.

@SilverDestiny

> @bcdj If the device is rooted, is there any way to use AICore?

No. AICore only works on unrooted devices.

@tzhang997
Contributor

Thanks folks for reporting this issue. For those of you testing on Pixel 9 series devices, could you please capture a bug report and share it with us here? It will help us root cause the issue.

@gauravzero

gauravzero commented Oct 16, 2024 via email

@asthagarg2428

> Thanks folks for reporting this issue. For those of you testing on Pixel 9 series devices, could you please capture a bug report and share it with us here? It will help us root cause the issue.

As per the Gemini Nano documentation, there are many supported devices listed along with the Pixel 9; what is this sample project's hard dependency on the Pixel 9?

@dhruvkaushal11

dhruvkaushal11 commented Jan 6, 2025

@tzhang997
It is mentioned in the document that this should work on these devices:

Supported Devices: AICore is currently available on Pixel 9 series devices, Google Pixel 8 Series devices including Pixel 8 and Pixel 8a, Samsung S24 Series devices, Samsung Z Fold6, Samsung Z Flip6, Realme GT 6, Motorola Edge 50 Ultra, Motorola Razr 50 Ultra, Xiaomi 14T/Pro, and Xiaomi MIX Flip.

Any update on this?
I am trying this on a non-rooted Pixel 8 device but am facing this exact issue:

AICore failed with error type 2-INFERENCE_ERROR and error code 8-NOT_AVAILABLE: Required LLM feature not found

Followed all the steps mentioned here: https://developer.android.com/ai/gemini-nano/experimental

Enabled AICore from developer settings as well. This is available in India, right?

@SilverDestiny

AICore works on more devices, but AICore experimental access is only available on the Pixel 9 series right now: https://developer.android.com/ai/gemini-nano/experimental#prerequisites

@dhruvpaytm

dhruvpaytm commented Jan 6, 2025

@SilverDestiny @tzhang997 Got it. And this is universally available, right, not specific to a geographical location? I'm asking because I currently have a Pixel 8 and am thinking of buying a Pixel 9 to test this feature, so I'm confirming beforehand. I am in India.

@SilverDestiny

India should be fine. Restricted locations are not guaranteed to work, in general.

@anhtuannd

> @SilverDestiny @tzhang997 Got it. And this is universally available, right, not specific to a geographical location? I'm asking because I currently have a Pixel 8 and am thinking of buying a Pixel 9 to test this feature, so I'm confirming beforehand. I am in India.

Have you enabled Developer options > AICore > Enable on-device GenAI Features?

@JoseAlcerreca

I'm seeing this as well; some more logs:

 [AiCoreLlmService] Unable to initialize. MODEL_STATUS_INIT_FAILURE   elk: [AiCoreLlmService] no available feature for [103]

@anhtuannd

I was able to run it on a Pixel 8a by following the guide with the AICore beta and enabling Developer options > AICore > Enable on-device GenAI Features.

@abaker
Author

abaker commented Feb 7, 2025

I'm also able to run this on my Pixel 8 now.
