
ERROR [12/13] RUN pip install flash-attn --no-build-isolation #1229

Open

promaprogga opened this issue Sep 15, 2024 · 1 comment

@promaprogga

# Use an official Python runtime as a parent image
FROM python:3.10-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container
COPY . /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    git \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
RUN pip install --upgrade pip

# Install torch and torchvision first
RUN pip install torch==2.0.1 torchvision==0.15.2

# Install remaining Python libraries
RUN pip install transformers==4.37.2 flask packaging

RUN pip install flash-attn --no-build-isolation

I'm getting this error when I try to build the Dockerfile on Linux:

=> ERROR [12/13] RUN pip install flash-attn --no-build-isolation                                                  2.8s
------
 > [12/13] RUN pip install flash-attn --no-build-isolation:
0.526 Collecting flash-attn
0.833   Downloading flash_attn-2.6.3.tar.gz (2.6 MB)
1.369      ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.6/2.6 MB 7.3 MB/s eta 0:00:00
1.643   Preparing metadata (setup.py): started
2.507   Preparing metadata (setup.py): finished with status 'error'
2.510   error: subprocess-exited-with-error
2.510
2.510   × python setup.py egg_info did not run successfully.
2.510   │ exit code: 1
2.510   ╰─> [20 lines of output]
2.510       fatal: not a git repository (or any of the parent directories): .git
2.510       /tmp/pip-install-arl6lbbl/flash-attn_d2269f4c4a7f49df8a1d94c7a99ab377/setup.py:95: UserWarning: flash_attn was requested, but nvcc was not found.  Are you sure your environment has nvcc available?  If you're installing within a container from https://hub.docker.com/r/pytorch/pytorch, only images whose names contain 'devel' will provide nvcc.
2.510         warnings.warn(
2.510       Traceback (most recent call last):
2.510         File "<string>", line 2, in <module>
2.510         File "<pip-setuptools-caller>", line 34, in <module>
2.510         File "/tmp/pip-install-arl6lbbl/flash-attn_d2269f4c4a7f49df8a1d94c7a99ab377/setup.py", line 179, in <module>
2.510           CUDAExtension(
2.510         File "/usr/local/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1048, in CUDAExtension
2.510           library_dirs += library_paths(cuda=True)
2.510         File "/usr/local/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1179, in library_paths
2.510           if (not os.path.exists(_join_cuda_home(lib_dir)) and
2.510         File "/usr/local/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 2223, in _join_cuda_home
2.510           raise EnvironmentError('CUDA_HOME environment variable is not set. '
2.510       OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
2.510
2.510
2.510       torch.__version__  = 2.0.1+cu117
2.510
2.510
2.510       [end of output]
2.510
2.510   note: This error originates from a subprocess, and is likely not a problem with pip.
2.524 error: metadata-generation-failed
2.524
2.524 × Encountered error while generating package metadata.
2.524 ╰─> See above for output.
2.524
2.524 note: This is an issue with the package mentioned above, not pip.
2.524 hint: See above for details.
------
Dockerfile:36
--------------------
  34 |     RUN pip install ninja
  35 |
  36 | >>> RUN pip install flash-attn --no-build-isolation
  37 |     # Expose the Flask port
  38 |     EXPOSE 5000
--------------------
ERROR: failed to solve: process "/bin/sh -c pip install flash-attn --no-build-isolation" did not complete successfully: exit code: 1
@tridao
Contributor

tridao commented Sep 15, 2024

As the error message says:
CUDA_HOME environment variable is not set. Please set it to your CUDA install root
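
A minimal sketch of one way to resolve this, assuming the build can switch to a CUDA "devel" base image (the error output itself notes that only pytorch/pytorch images whose names contain 'devel' provide nvcc). The specific tag 2.0.1-cuda11.7-cudnn8-devel and the explicit CUDA_HOME export are assumptions chosen to match the torch==2.0.1+cu117 reported in the log:

# Start from a CUDA "devel" image so nvcc is available at build time
# (tag is an assumption; pick one matching your target CUDA/toolkit version)
FROM pytorch/pytorch:2.0.1-cuda11.7-cudnn8-devel

# Belt-and-braces: point CUDA_HOME at the toolkit shipped in the devel image
ENV CUDA_HOME=/usr/local/cuda

# Set the working directory and copy the project in
WORKDIR /app
COPY . /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    git \
    && rm -rf /var/lib/apt/lists/*

RUN pip install --upgrade pip

# torch 2.0.1 + CUDA 11.7 ship with the base image; keep the torchvision pin
# from the original Dockerfile in case it is not already present
RUN pip install torchvision==0.15.2

# Remaining Python dependencies (ninja speeds up the flash-attn compile)
RUN pip install transformers==4.37.2 flask packaging ninja

# With nvcc on PATH and CUDA_HOME set, the CUDA extension build can proceed
RUN pip install flash-attn --no-build-isolation

# Expose the Flask port
EXPOSE 5000

Alternatively, if staying on python:3.10-slim is a requirement, the CUDA toolkit (including nvcc) would have to be installed inside the image and CUDA_HOME pointed at its install root before the flash-attn step, at the cost of a considerably larger image.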
