diff --git a/README.md b/README.md
index 96df7ba..87b7bb2 100644
--- a/README.md
+++ b/README.md
@@ -38,12 +38,11 @@ pip install flash-attn --no-build-isolation
 
 
 Have trouble installing flash attention?
-You can create a clean conda env and install the following CUDA dev tools:
+In a clean conda env, install the `cuda-toolkit 12.3.0` for compilation:
 ```bash
-conda install conda-forge::cudatoolkit-dev -y
 conda install nvidia/label/cuda-12.3.0::cuda-toolkit -y
 ```
-Then you can install pytorch and flash attention:
+Then install pytorch and flash attention:
 ```
 # Install a torch version that you like.
 pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121
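
For anyone following the updated instructions end to end, the full sequence looks roughly like this. It is a sketch assembled from the commands in the hunk above: the environment name and Python version are hypothetical choices, and torch 2.1.0 with the cu121 wheel index is simply the example the README uses.

```bash
# Sketch only: the env name "flash-attn-env" and python=3.10 are hypothetical choices.
conda create -n flash-attn-env python=3.10 -y
conda activate flash-attn-env

# CUDA 12.3 toolkit from the NVIDIA channel, used to compile flash-attn (command from the diff above).
conda install nvidia/label/cuda-12.3.0::cuda-toolkit -y

# A torch build for a CUDA 12.x runtime; 2.1.0/cu121 is the README's example.
pip install torch==2.1.0 --index-url https://download.pytorch.org/whl/cu121

# Build flash-attn against the installed torch (command from the hunk context above).
pip install flash-attn --no-build-isolation
```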