
Add fp16 option for NNAPI in Android classification example #105

Open
wants to merge 4 commits into master

Conversation

freedomtan

Some NNAPI accelerators support only fp16. This adds an option that lets fp32 models run on fp16-only accelerators.
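For context, a minimal sketch of how such an option is typically wired into the TFLite Java Interpreter; the factory class and parameter names below are illustrative assumptions, not this PR's exact code:

    import java.nio.MappedByteBuffer;
    import org.tensorflow.lite.Interpreter;

    /** Illustrative sketch (not the PR's code): build an Interpreter that
     *  uses NNAPI and optionally relaxes fp32 computation to fp16. */
    final class InterpreterFactory {
      static Interpreter create(MappedByteBuffer model, boolean allowFp16) {
        Interpreter.Options options = new Interpreter.Options();
        options.setUseNNAPI(true); // hand supported ops to NNAPI
        // With the relaxation enabled, fp16-only accelerators can
        // execute a float (fp32) model.
        options.setAllowFp16PrecisionForFp32(allowFp16);
        return new Interpreter(model, options);
      }
    }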

@googlebot googlebot added the cla: yes CLA has been signed label Oct 22, 2019
@jdduke
Member

jdduke commented Oct 22, 2019

Can you link to a screenshot of the new UI? Thanks.

@freedomtan
Author

[Screenshot of the new UI: device-2019-10-23-175557]

@@ -34,7 +34,7 @@
    */
   public ClassifierQuantizedMobileNet(Activity activity, Device device, int numThreads)
       throws IOException {
-    super(activity, device, numThreads);
+    super(activity, device, numThreads, false);
Member
/*fp16=*/ false
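That is, annotate the bare boolean with an inline parameter-name comment so the call site documents itself:

    super(activity, device, numThreads, /*fp16=*/ false);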

@jdduke (Member) left a comment

How difficult would it be to gray out or disable the switch when float isn't selected, or when NNAPI isn't selected?

@freedomtan
Author

Thanks for the review. I don't know how difficult it is. I'll try to disable or gray out the switch. I am not familiar with Android widgets.
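A minimal sketch of such gating, using hypothetical names (fp16Switch and a stand-in Device enum) rather than the example app's actual fields:

    import android.widget.Switch;

    // Illustrative only: the names below stand in for the example app's
    // actual widgets and enums.
    final class Fp16SwitchGating {
      enum Device { CPU, GPU, NNAPI }

      /** Enable the fp16 switch only when NNAPI runs a float model;
       *  a disabled Switch is rendered grayed out by Android. */
      static void update(Switch fp16Switch, Device device, boolean isFloatModel) {
        fp16Switch.setEnabled(device == Device.NNAPI && isFloatModel);
      }
    }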

jdduke previously approved these changes Oct 25, 2019

@jdduke (Member) left a comment

Thanks!

@freedomtan
Author

Rebased to resolve a conflict.

jdduke previously approved these changes Nov 20, 2019

Commits:
- Some NNAPI accelerators are fp16 only. Add an option to allow fp32 on fp16 accelerators.
- Enable the FP16 switch only when NNAPI and a floating-point model are used.
@freedomtan
Author

Rebased and resolved conflicts.

Labels: cla: yes (CLA has been signed), ready to pull

4 participants