Releases · Dao-AILab/flash-attention
v2.5.5: Bump to v2.5.5
v2.5.4: Bump to v2.5.4
v2.5.3: Bump to v2.5.3
v2.5.2: Bump to v2.5.2
v2.5.1.post1: [CI] Install torch 2.3 using index
v2.5.1: Bump to v2.5.1
v2.5.0: Bump to v2.5.0
v2.4.3.post1: [CI] Fix CUDA 12.2.2 compilation
v2.4.3: Bump to v2.4.3
v2.4.2: Bump to v2.4.2
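Each tag above corresponds to a published flash-attn package version. A minimal sketch for checking which of these releases is installed locally, assuming the flash-attn wheel for one of the tags has already been pip-installed:

```python
# Print the installed flash-attn release; the version string matches
# the tags listed above without the leading "v" (e.g. "2.5.5").
import flash_attn

print(flash_attn.__version__)
```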