Releases: Dao-AILab/flash-attention

v2.5.5 (21 Feb 23:59)
Bump to v2.5.5

v2.5.4 (21 Feb 00:32)
Bump to v2.5.4

v2.5.3 (10 Feb 09:09)
Bump to v2.5.3

v2.5.2 (31 Jan 10:46)
Bump to v2.5.2

v2.5.1.post1 (30 Jan 22:34)
[CI] Install torch 2.3 using index

v2.5.1 (30 Jan 05:07)
Bump to v2.5.1

v2.5.0 (23 Jan 07:41)
Bump to v2.5.0

v2.4.3.post1 (22 Jan 01:24)
[CI] Fix CUDA 12.2.2 compilation

v2.4.3 (22 Jan 01:15)
Bump to v2.4.3

v2.4.2 (26 Dec 00:29)
Bump to v2.4.2
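
Each tag above corresponds to a release of the flash-attn package on PyPI, so a specific release can be pinned at install time and checked at runtime. A minimal sketch, assuming a CUDA-capable environment with a matching PyTorch already installed (the --no-build-isolation flag follows the project's documented install instructions):

    # Install a pinned release, e.g. v2.5.5:
    #   pip install flash-attn==2.5.5 --no-build-isolation
    # Then confirm the installed version matches the release tag:
    import flash_attn
    print(flash_attn.__version__)  # expected output: "2.5.5"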