Displays a warning when starting #136
Comments
Hi, this is likely because you have an old GPU that doesn't support Flash Attention. If image or video prediction works for you, you can ignore this warning. If there is an error, we have recently added a fallback to all available kernels (in #155) as a workaround for this problem; you can pull the latest code to get it.
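For context, the warning comes from a capability check at import time. A minimal sketch of that check (illustrative names, not sam2's actual implementation — in practice the `(major, minor)` pair comes from `torch.cuda.get_device_capability()`):

```python
def is_ampere_or_newer(major: int, minor: int) -> bool:
    # Flash Attention requires CUDA compute capability 8.0 (Ampere) or newer;
    # older GPUs fall back to the math kernel and trigger the warning.
    return (major, minor) >= (8, 0)

# Examples: RTX 3080 Ti is 8.6, RTX 4090 is 8.9, GTX 1080 is 6.1.
print(is_ampere_or_newer(8, 6))  # True  -> Flash Attention usable
print(is_ampere_or_newer(6, 1))  # False -> falls back to the math kernel
```

Note that on Ampere and newer cards (3080 Ti, 4090) this check passes, so if you still see the warning there, it is coming from a different code path and can usually be ignored as long as prediction works.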
Good evening. I have a 3080 Ti, which should support this, but I am getting the same error.
I'm getting the same error.
Heck, I have a 4090 with this. :(
```
D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-segment-anything-2\sam2\modeling\sam\transformer.py:20: UserWarning: Flash Attention is disabled as it requires a GPU with Ampere (8.0) CUDA capability.
  OLD_GPU, USE_FLASH_ATTN, MATH_KERNEL_ON = get_sdpa_settings()
```
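The fallback mentioned in #155 follows a common try-each-kernel-in-order pattern. A minimal, self-contained sketch of that idea (the kernel names and helper are hypothetical stand-ins, not the repository's actual code):

```python
def run_with_fallback(kernels, *args):
    """Try each (name, fn) attention kernel in order; return the first success.

    Mirrors the idea behind the #155 workaround: if the Flash Attention
    kernel raises, fall through to the remaining available kernels.
    """
    errors = []
    for name, kernel in kernels:
        try:
            return kernel(*args)
        except RuntimeError as exc:
            errors.append(f"{name}: {exc}")
    raise RuntimeError("all attention kernels failed: " + "; ".join(errors))

# Toy stand-ins: the "flash" kernel fails on pre-Ampere GPUs,
# while the "math" kernel always works.
def flash_kernel(x):
    raise RuntimeError("Flash Attention requires Ampere (8.0)")

def math_kernel(x):
    return x * 2

result = run_with_fallback([("flash", flash_kernel), ("math", math_kernel)], 21)
print(result)  # 42
```

Because the math kernel is always available, this pattern turns a hard failure on older GPUs into the (harmless) warning seen above.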