Activating environment in dockerfile - related to #81 #89
Can you share the contents of your Dockerfile?
Yes:

I see. What is your goal with that line?
The desired outcome is to activate the environment so that we can run our Dash/Flask application with our desired version of Python and dependencies.
I have the same issue but with a different goal. I just want one environment in my Dockerfile, but I want to enable the environment variables that are set in https://conda.io/docs/user-guide/tasks/build-packages/compiler-tools.html. The only way to do that is using "source activate root", but subsequent Dockerfile commands don't pick that up. For example, if I do `ENV source activate root` and then follow it with `RUN pip install regex`, I don't have the correct gcc environment variables set. In many ways, this is tantamount to asking: can I activate a conda env at boot and have it available everywhere?
Once conda 4.6 is released, you can make use of the new activation support in your entrypoint.
It's not just in the entrypoint; it's while building the Dockerfile itself. For example, today I have to do this before running subsequent pip install commands inside the Dockerfile, because otherwise my pip installs will not pick up the new gcc.
@sandys One possible workaround for your installation problem in the Dockerfile is to run the source activate and the pip installs in a single /bin/bash command, as sketched below.
This at least solved the installation problem for me.
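A minimal sketch of that workaround (the environment and package names are placeholders, not from the original comment):

```dockerfile
# Chain activation and installation inside one bash invocation, so the
# activated environment is in effect for the pip install.
RUN /bin/bash -c "source activate myenv && pip install regex"
```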
@kalefranz Yes, this is my issue as well. Is there any solution for this?

@JakartaLaw @kalefranz I am also facing the same issue.

One could try using the ... Just an idea, I didn't have the time to test it myself.
I had to do stuff like this:

Then, I also had to define specific scripts that did similar things: somescript.sh

Then, in the Dockerfile, I copied these scripts into the image and used them in my entrypoint.

I battled this for a long time today and this seems to work.
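A sketch of what such a script might look like (the contents of the original somescript.sh were not shown, so the paths and environment name here are assumptions):

```bash
#!/bin/bash
# somescript.sh: make `conda activate` available in a non-interactive shell,
# activate the environment, then hand off to whatever command was passed in.
set -euo pipefail
source /opt/conda/etc/profile.d/conda.sh
conda activate myenv
exec "$@"
```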
erewok's solution works for me too, thanks!
The way I do it in my Dockerfile is as follows (source: https://medium.com/@chadlagore/conda-environments-with-docker-82cdc9d25754): optionally set the shell to bash, create your conda env, then activate myenv and work in this environment.
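A rough sketch of that pattern (image, file, and environment names are assumptions; see the linked post for the original):

```dockerfile
FROM continuumio/miniconda3

# Optional: use bash for RUN so that `source` is available
SHELL ["/bin/bash", "-c"]

# Create your conda env from an environment file
COPY environment.yml .
RUN conda env create -f environment.yml

# Activate myenv for interactive shells and put its binaries on PATH
RUN echo "source activate myenv" >> ~/.bashrc
ENV PATH=/opt/conda/envs/myenv/bin:$PATH
```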
@nsarode-joyn This worked for me, thanks!

@nsarode-joyn Thanks! You saved my day :)

@nsarode-joyn's solution works for me only when I run the docker container with a bash shell (i.e. docker run -it <docker_image> bash).

So, to simplify: how do I activate a conda environment for the duration of a Dockerfile build? I.e., if I say ...

@rchossein Did you ever solve this problem? I am not able to run it using ...
This is similar to the solutions above, but avoids some of the boilerplate in every RUN command. With that in place, subsequent RUN commands should work as expected. Alternatively, you can set the environment persistently (including for an interactive shell).
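One possible shape of this approach (a sketch assuming conda lives at /opt/conda and the environment is called myenv, created in an earlier layer; the original snippets from this comment were not preserved):

```dockerfile
# Have every non-interactive bash source conda's hook and activate the env,
# so individual RUN commands need no per-command boilerplate.
RUN echo "source /opt/conda/etc/profile.d/conda.sh && conda activate myenv" \
    > /etc/profile.d/conda_env.sh
ENV BASH_ENV=/etc/profile.d/conda_env.sh
SHELL ["/bin/bash", "-c"]

# Then something like this works as expected:
RUN pip install regex

# And for interactive shells, source the same file from ~/.bashrc:
RUN echo "source /etc/profile.d/conda_env.sh" >> ~/.bashrc
```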
You'll want to be careful when setting that, though. On a different note: IMHO, the only fail-safe way to generate an image is to explicitly write out all the commands generated by activation and include them at the end of the Dockerfile.
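For illustration, one way to do that with conda's own tooling (`conda shell.posix activate` prints the activation commands; the env name and prefix below are placeholders):

```dockerfile
# Print the exact shell commands that `conda activate myenv` would execute,
# so they can be copied into explicit ENV instructions.
RUN conda shell.posix activate myenv

# Typical result, written out explicitly at the end of the Dockerfile:
ENV PATH=/opt/conda/envs/myenv/bin:$PATH \
    CONDA_PREFIX=/opt/conda/envs/myenv \
    CONDA_DEFAULT_ENV=myenv
```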
As @kalefranz mentioned, ...
Not sure if this is appropriate to add to this issue, but my problem is mirrored exactly in this thread, so I thought I'd post here. I'm having the exact same error as described by @JakartaLaw. I'm trying to create a docker image that results in a container with the environment activated on run. Here are the contents of my Dockerfile:

And the yaml file defining the environment:

Similar to @nsarode-joyn, I followed the advice in this excellent post. The Dockerfile builds without problem, but when I execute it, I get the following error (inside the container), and the new environment is not active:

I've gotten to the bottom of the internet in search of an answer but can't find a solution. I really need the environment to be created, the packages specified in the yaml file installed, and the environment active when the container starts.

Any wisdom here would be massively appreciated.
https://github.com/jupyter/docker-stacks/pull/973/files
Thank you @mathematicalmichael! It turns out all I needed to do was add one line. For completeness, in case anyone encounters a similar issue, here is the full file:
You can do the exact same thing inside docker:

The first line modifies the path and other environment variables inside the container, and these are persisted in the image, so you don't need to put the command in .bashrc. Running bash as the entrypoint will then load the environment, so that you get a shell inside myenv.
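A minimal sketch of one way to get the behaviour described here, assuming conda is installed at /opt/conda and the environment is named myenv (the original snippet was not preserved):

```dockerfile
# Bake the environment's variables directly into the image so they are
# present in every subsequent layer and in the running container.
ENV CONDA_DEFAULT_ENV=myenv \
    CONDA_PREFIX=/opt/conda/envs/myenv \
    PATH=/opt/conda/envs/myenv/bin:$PATH

# Running bash as the entrypoint then drops you into a shell inside myenv.
ENTRYPOINT ["/bin/bash"]
```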
I tried nearly all of them, but I could not manage to activate a conda environment in a Heroku container. If someone has done it, please share.

Use ...

Another thing: creating an environment inside the docker image increases its size; you can just use the base env instead.

Have you tried the instructions here? ...
@nicornk You might check out this article: Activating a Conda environment in your Dockerfile

tl;dr:

```dockerfile
# The code to run when container is started:
COPY run.py .
ENTRYPOINT ["conda", "run", "-n", "myenv", "python", "run.py"]
```
@sterlinm Thanks for your reply. I saw that article already and tried it out, but there seems to be a difference in how the output is streamed / flushed to the console. For example, if we have the following Python script:

and use the following run.sh:

and the following RUN command in the Dockerfile, then the output is streamed directly:

When I use the following RUN command in the Dockerfile instead, the output is buffered:

So there seems to be a fundamental difference between using the two approaches. Any idea?
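For orientation, a sketch of the two variants being compared (the original files were not preserved, so the script names, environment name, and commands are assumptions):

```dockerfile
# Variant 1: wrap the program in a bash script that activates the env.
# run.sh would contain:
#   source /opt/conda/etc/profile.d/conda.sh
#   conda activate myenv
#   python run.py
RUN /bin/bash run.sh

# Variant 2: let conda start the program directly.
RUN conda run -n myenv python run.py
```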
I don't think there is a (simple) way to do that. Conda modifies many environment variables and search paths, and in order to reflect this behaviour inside a container, you either need to make all those changes by hand or let conda do it, which means you have to run a shell and start your program inside it. You can do that by using ...

The observation seems right. In the first example, docker pipes the stdout of the bash script directly. In the second example, the output of the python program is buffered before being piped to stdout. The main difference is that in the second example no shell is executed. Either python itself or conda could be the reason for that (in fact, I've seen similar issues with python before). Maybe try running the python program in a container that has all dependencies installed via pip or a system repository?

My assumption is that ...
@nicornk I hadn't noticed that issue with how it buffers. I'd seen that conda run is considered experimental, but it seems to have been considered experimental for years. My impression is that all of the strategies for activating conda inside of docker are a bit wonky, so you have to pick your poison. There's an open issue about allowing conda run to avoid buffering stdout, and somebody there came up with a trick that seems to work.

Now, this doesn't quite get you all the way there. For what it's worth, I tend to use a shell script that sources the conda.sh file and activates the environment, as sketched below.
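For illustration, here is roughly how such a script can be wired in as the container entrypoint (script name, paths, and the final command are assumptions, not from the original comment):

```dockerfile
# entrypoint.sh sources /opt/conda/etc/profile.d/conda.sh, activates the env,
# and then exec's whatever command the container was given.
COPY entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
CMD ["python", "run.py"]
```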
Here is the code that has worked for me: it sets up a non-root user, copies some files into the container, uses the base environment, and runs an application.
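Since the referenced code block was not preserved, here is a rough sketch of that kind of setup (the user name, file names, and application are placeholders):

```dockerfile
FROM continuumio/miniconda3

# Create and switch to a non-root user
RUN useradd --create-home appuser
USER appuser
WORKDIR /home/appuser

# Copy some files into the container
COPY --chown=appuser:appuser app.py requirements.txt ./

# Use the base environment (its python/pip are already on PATH in this image);
# --user keeps the install writable by the non-root user
RUN pip install --user -r requirements.txt

# Run the application
CMD ["python", "app.py"]
```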
For those trying to use conda in a non-interactive shell within your container, most (seemingly all) of the instructions in this issue won't help you (although they will help with an interactive shell). See conda/conda#7980 for more information. In that case, adding this to your bash script makes it work:
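The referenced snippet is missing here; the usual fix discussed around conda/conda#7980 is to source conda's profile script before activating, roughly like this (the path assumes a default /opt/conda install, and myenv is a placeholder):

```bash
# Make `conda activate` work in a non-interactive shell
source /opt/conda/etc/profile.d/conda.sh
conda activate myenv
```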
Is there a solution that also reliably works with docker compose?
My Dockerfile contains roughly:

```dockerfile
ARG CUDA_VERSION=10.2
ENV PATH ${CUDA_PATH}/bin:$PATH
ENV DEBIAN_FRONTEND=noninteractive
# Set locale
ENV LANG C.UTF-8 LC_ALL=C.UTF-8
# Update package sources
# Install Anaconda
# Set conda name
ENV CONDA_ENV_NAME faceformer
```

Run error:

```
CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
If your shell is Bash or a Bourne variant, enable conda for the current user with
    ...
or, for all users, enable conda with
    ...
The options above will permanently enable the 'conda' command, but they do NOT
put conda's base (root) environment on PATH. To do so, run
    conda activate
in your terminal, or to put the base environment on PATH permanently, run
    ...
Previous to conda 4.4, the recommended way to activate conda was to modify PATH in
your ~/.bashrc file. You should manually remove the line that looks like
    ...
^^^ The above line should NO LONGER be in your ~/.bashrc file! ^^^
```

The command '/bin/bash -c echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && echo "conda activate faceformer" >> ~/.bashrc && conda activate $CONDA_ENV_NAME && conda install pytorch==1.9.0 torchvision==0.10.0 torchaudio==0.9.0 cudatoolkit=10.2 -c pytorch && pip install -r requirements.txt' returned a non-zero code: 1
You saved me!

It's true! I encountered an error installing x11-common due to the `ENV BASH_ENV ~/.bashrc` instruction when building a docker image.
We are building a docker image based on the miniconda3:latest image. The Dockerfile is the following:

```dockerfile
FROM continuumio/miniconda3:latest
COPY environment.yml /home/files/environment.yml
RUN conda env create -f /home/files/environment.yml
RUN conda activate webapp
```

where webapp is the name of the environment. However, we get the error message:

```
CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
If your shell is Bash or a Bourne variant, enable conda for the current user with
...
```

This seems to be related to #81. We see, however, in the Dockerfile for miniconda3 that this command is already run. If we run `docker container run -it`, then we can run `conda activate webapp`. Are we missing something?