
Make it compatible with other torch devices that support the fp16 dtype #2668

Open · wants to merge 1 commit into main
Conversation

python279

  1. specify dtype for specific devices
  2. convert the depth to float after model forward

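The two changes listed above can be sketched as follows. This is a minimal illustration, not the actual PR diff: `run_depth_model` and its arguments are hypothetical names, and the sketch assumes the model has already been moved to the target device with the dtype configured by webui (e.g. torch.float16 on devices without fp32 support).

```python
import torch


def run_depth_model(model: torch.nn.Module, image: torch.Tensor,
                    device: torch.device) -> torch.Tensor:
    # 1. Specify the dtype for the specific device: read it from the
    #    model's own parameters instead of hard-coding float32, so the
    #    input matches whatever dtype webui configured for this device.
    dtype = next(model.parameters()).dtype
    with torch.no_grad():
        depth = model(image.to(device=device, dtype=dtype))
    # 2. Convert the depth back to float after the model forward, so
    #    downstream post-processing works regardless of the model dtype.
    return depth.float()
```

The key point is that the annotator no longer assumes fp32: the input is cast to the model's dtype before the forward pass, and the output is cast back to float32 afterwards.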
@huchenlei
Collaborator

Can you elaborate on what this PR is trying to solve? i.e. before this PR, was there an issue with midas annotator?

@python279
Author

> Can you elaborate on what this PR is trying to solve? i.e. before this PR, was there an issue with midas annotator?

We found that some torch aten ops do not support fp32 on our NPU, so we set the default dtype to torch.float16 in webui's devices.py. Extensions are expected to follow that configuration from webui.
