
Commit

Update README.md
pengzhangzhi authored Dec 22, 2024
1 parent e67dee2 commit 38c6a7f
Showing 1 changed file with 2 additions and 3 deletions.
README.md: 5 changes (2 additions & 3 deletions)
````diff
@@ -38,12 +38,11 @@ pip install flash-attn --no-build-isolation
 <details>
 <summary>Have trouble installing flash attention?</summary>
 
-You can create a clean conda env and install the following CUDA dev tools:
+In a clean conda env, install the `cuda-toolkit 12.3.0` for compilation:
 ```bash
-conda install conda-forge::cudatoolkit-dev -y
+conda install nvidia/label/cuda-12.3.0::cuda-toolkit -y
 ```
-Then you can install pytorch and flash attention:
+Then install pytorch and flash attention:
 ```
 # Install a torch version that you like.
 pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121
````
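Put together, the updated install path in this hunk amounts to the sequence below. This is a sketch assembled only from the lines shown above and the hunk header; the diff is truncated here, so the README may contain further steps not reflected in it.

```bash
# Install the CUDA 12.3 toolkit into the conda env for compiling flash-attn.
conda install nvidia/label/cuda-12.3.0::cuda-toolkit -y

# Install a torch version that you like (the README shows a cu121 wheel).
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121

# Build flash attention against the installed CUDA toolkit.
pip install flash-attn --no-build-isolation
```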
