
Releases: OpenProteinAI/flash-attention

v0.2.8-3-causal-prefix

21 Mar 21:05

v0.2.8-2-causal-prefix

31 Jan 18:53
Compile for Python 3.8 as well.

v0.2.8-1-causal-prefix

30 Jan 19:17
Optimize by skipping tokens that no query attends to.
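This fork tracks a causal-prefix attention variant, so "unattended tokens" here presumably means key positions (or whole key tiles) that no query in a block attends to, which a blocked kernel can skip outright. A rough NumPy sketch of that idea, assuming a causal mask with a fully visible prefix; the mask construction and tile size are illustrative, not this repository's kernel:

```python
import numpy as np

def causal_prefix_mask(seq_len, prefix_len):
    """True where query i may attend to key j: the shared prefix is
    visible to every query; positions after it follow the causal rule."""
    i = np.arange(seq_len)[:, None]  # query index
    j = np.arange(seq_len)[None, :]  # key index
    return (j < prefix_len) | (j <= i)

def skippable_tiles(mask, block=2):
    """Tiles of the mask that are entirely False can be skipped by a
    blocked attention kernel: no query in the tile attends there."""
    n = mask.shape[0] // block
    tiles = mask.reshape(n, block, n, block)
    return ~tiles.any(axis=(1, 3))  # True => whole tile is masked out

m = causal_prefix_mask(8, 2)
skip = skippable_tiles(m, block=2)
```

With a longer prefix more tiles fall below the causal boundary, so fewer tiles are skippable; the savings come from the strictly-upper-triangular region that causal masking rules out.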

v0.2.8-causal-prefix

27 Jan 15:54
7f9d74e
Bump version to 0.2.8.

v0.1.3-causal-prefix

14 Dec 22:22
Merge with upstream; includes, among other changes, in-place rotary embeddings.
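The in-place rotary change refers to upstream flash-attention applying rotary position embeddings without allocating a fresh output tensor. As a minimal NumPy sketch of the rotation itself (the pairing of feature halves and the base frequency follow the common rotary convention; this is not the repository's CUDA kernel):

```python
import numpy as np

def apply_rotary_inplace(x, base=10000.0):
    """Rotate pairs of feature dimensions by position-dependent angles,
    writing the result back into x (shape: [seq_len, head_dim])."""
    seq, dim = x.shape
    half = dim // 2
    inv_freq = base ** (-np.arange(half) / half)      # per-pair frequency
    theta = np.arange(seq)[:, None] * inv_freq[None, :]
    cos, sin = np.cos(theta), np.sin(theta)
    x1, x2 = x[:, :half].copy(), x[:, half:].copy()   # snapshot both halves
    x[:, :half] = x1 * cos - x2 * sin                 # overwrite in place
    x[:, half:] = x1 * sin + x2 * cos
    return x
```

Because each pair is a pure rotation, per-position norms are preserved, and position 0 (angle 0) is left unchanged; doing it in place avoids one full activation-sized allocation per call.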

python38

05 Dec 16:12
Added Python 3.8 support.

py-38

05 Dec 15:56
Added CUDA 11 versions.

p38

05 Dec 13:18
Added Python 3.8 wheel.

opencuda

05 Dec 15:32
Added build matrix.

multipython

05 Dec 14:09
Add Python 3.8 and additional torch versions.