Releases: OpenProteinAI/flash-attention
v0.2.8-3-causal-prefix
PyTorch 2 support
v0.2.8-2-causal-prefix
Compile wheels for Python 3.8 as well
v0.2.8-1-causal-prefix
Optimize by skipping unattended tokens
v0.2.8-causal-prefix
Bump version to 0.2.8
v0.1.3-causal-prefix
Merge with upstream; includes in-place rotary embedding, among other changes
python38
Added Python 3.8 support
py-38
Added CUDA 11 versions
p38
Added Python 3.8 wheel
opencuda
Added build matrix
multipython
Add Python 3.8 and multiple PyTorch versions