Could you release 2.6.4 and FA3 .whl files with CUDA 12.4, torch 2.4.1, and Python 3.11 support? The latest prebuilt wheel available is:

flash_attn-2.6.3+cu123torch2.4cxx11abiFalse-cp311-cp311-linux_x86_64.whl
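For context, a minimal sketch of how the requested wheel filename decomposes. `wheel_tag` is a hypothetical helper (not part of flash-attn) that assembles a filename following the naming convention visible in the wheel above, so you can check which build tag your environment would need:

```python
import sys

def wheel_tag(flash_attn_version, cuda, torch_version, cxx11abi):
    """Build a flash-attn wheel filename following the convention
    flash_attn-<ver>+cu<cuda>torch<torch>cxx11abi<bool>-<py>-<py>-linux_x86_64.whl
    (hypothetical helper; derived from the filename in this issue)."""
    py = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return (f"flash_attn-{flash_attn_version}+cu{cuda}"
            f"torch{torch_version}cxx11abi{cxx11abi}"
            f"-{py}-{py}-linux_x86_64.whl")

# On a Python 3.11 interpreter this reproduces the wheel name above:
print(wheel_tag("2.6.3", "123", "2.4", "False"))
```

The request here amounts to asking for builds with `cu124` and `torch2.4.1` in that tag.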
tqangxl changed the title on Sep 17, 2024: "release .whl for CUDA 12.4 torch2.4.1 python 3.11?" → "FA3 release .whl for CUDA 12.4 torch2.4.1 python 3.11?" → "2.6.4 and FA3 release .whl for CUDA 12.4 torch2.4.1 python 3.11?"