Releases · mjun0812/flash-attention-prebuild-wheels
v0.0.5
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 2.6.3, 2.7.4.post1 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.1, 2.6.0 | 12.4.1, 12.6.3 |
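
A prebuilt wheel only works when the local Python, PyTorch, and CUDA versions line up with the matrix above. A minimal sketch, assuming PyTorch is already installed and that matching the CUDA major.minor is sufficient, that checks the running environment against the v0.0.5 matrix before picking a wheel:

```python
import sys

import torch

# Compatibility matrix for v0.0.5, copied from the table above.
SUPPORTED_PYTHON = {"3.10", "3.11", "3.12"}
SUPPORTED_TORCH = {"2.0.1", "2.1.2", "2.2.2", "2.3.1", "2.4.1", "2.5.1", "2.6.0"}
SUPPORTED_CUDA = {"12.4", "12.6"}  # wheels built against CUDA 12.4.1 / 12.6.3

python_version = f"{sys.version_info.major}.{sys.version_info.minor}"
torch_version = torch.__version__.split("+")[0]  # drop local tag, e.g. "+cu124"
cuda_version = torch.version.cuda  # e.g. "12.4"; None on CPU-only builds

assert python_version in SUPPORTED_PYTHON, f"Python {python_version} not covered"
assert torch_version in SUPPORTED_TORCH, f"PyTorch {torch_version} not covered"
assert cuda_version in SUPPORTED_CUDA, f"CUDA {cuda_version} not covered"
print("Environment matches a v0.0.5 wheel")
```

A matching wheel can then be installed straight from the release assets with `pip install <wheel URL>`; the asset filename encodes the Flash-Attention, CUDA, PyTorch, and Python versions, so copy it from the release page rather than guessing it.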
v0.0.4
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 2.7.3 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.1 | 11.8.0, 12.1.1, 12.4.1 |
v0.0.3
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 2.7.2.post1 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.1 | 11.8.0, 12.1.1, 12.4.1 |
v0.0.2
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 2.4.3, 2.5.6, 2.6.3, 2.7.0.post2 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.1 | 11.8.0, 12.1.1, 12.4.1 |
v0.0.1
| Flash-Attention | Python | PyTorch | CUDA |
| --- | --- | --- | --- |
| 1.0.9, 2.4.3, 2.5.6, 2.5.9, 2.6.3 | 3.10, 3.11, 3.12 | 2.0.1, 2.1.2, 2.2.2, 2.3.1, 2.4.1, 2.5.0 | 11.8.0, 12.1.1, 12.4.1 |
v0.0.0
- Flash-Attention
- Python
- PyTorch
  - 2.0.1
  - 2.1.2
  - 2.2.2
  - 2.3.1
  - 2.4.1
  - 2.5.0
- CUDA