docker GH action
gbayarri committed Jun 5, 2024
1 parent d853753 commit 47fa83f
Showing 4 changed files with 204 additions and 0 deletions.
29 changes: 29 additions & 0 deletions .github/workflows/docker.yaml
@@ -0,0 +1,29 @@
on:
  push:
    paths:
      - common/docker/Dockerfile
  workflow_dispatch:

name: Dockerfile
jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - name: Copy Dockerfile
        run: |
          cp common/docker/Dockerfile biobb_wf_amber_abc_setup/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_amber_md_setup/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_amber_md_setup_lig/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_cmip/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_dna_helparms/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_flexdyn/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_flexserv/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_godmd/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_ligand_parameterization/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_md_setup/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_md_setup_mutations/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_pmx_tutorial/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_protein_md_analysis/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_protein-complex_md_setup/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_structure_checking/docker/Dockerfile
          cp common/docker/Dockerfile biobb_wf_virtual-screening_fpocket/docker/Dockerfile
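As listed above, the sync job only copies files inside the runner's workspace: without a checkout step the repository contents are not present on the runner, and without a commit/push step the copies are not written back to the repository. A minimal, hypothetical completion of the job (not part of this commit) might look like:

```yaml
    steps:
      # Hypothetical: make the repository contents available on the runner
      - uses: actions/checkout@v4
      - name: Copy Dockerfile
        run: |
          cp common/docker/Dockerfile biobb_wf_md_setup/docker/Dockerfile
          # ... one cp per workflow repository, as above ...
      # Hypothetical: commit and push the copied files back to the repository
      - name: Commit and push copies
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add '*/docker/Dockerfile'
          git commit -m "Sync common Dockerfile" || echo "Nothing to commit"
          git push
```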
72 changes: 72 additions & 0 deletions common/docker/Dockerfile
@@ -0,0 +1,72 @@
# Base docker with miniconda
FROM continuumio/miniconda3

# Define working dir
WORKDIR /app

# Define a directory for the volume
VOLUME /data

# Define REPO & SUBREPO variables
ARG REPO
ARG SUBREPO

# Check if REPO variable is set
RUN if [ -z "$REPO" ]; then echo "REPO variable is not set. Cancelling build." && exit 1; fi

# Define REPOSITORY & SUBREPOSITORY environment variables
ENV REPOSITORY=$REPO
ENV SUBREPOSITORY=$SUBREPO

# Download the conda environment file (from the jupyter repo $REPO)
RUN wget https://raw.githubusercontent.com/bioexcel/$REPOSITORY/main/conda_env/environment.yml -O /app/workflow.env.yml

# Download the jupyter notebook file (the url varies depending on if the subrepository is set or not) and the pure python workflow file (from biobb_workflows)
RUN if [ -z "$SUBREPOSITORY" ]; then \
wget https://raw.githubusercontent.com/bioexcel/$REPOSITORY/main/$REPOSITORY/notebooks/$REPOSITORY.ipynb -O /app/notebook.ipynb; \
wget https://raw.githubusercontent.com/bioexcel/biobb_workflows/main/$REPOSITORY/python/workflow.py -O /app/workflow.py; \
else \
wget https://raw.githubusercontent.com/bioexcel/$REPOSITORY/main/$REPOSITORY/notebooks/$SUBREPOSITORY/${REPOSITORY}_$SUBREPOSITORY.ipynb -O /app/notebook.ipynb; \
wget https://raw.githubusercontent.com/bioexcel/biobb_workflows/main/${REPOSITORY}_$SUBREPOSITORY/python/workflow.py -O /app/workflow.py; \
fi

# Enable libmamba as solver
RUN conda config --set solver libmamba

# Create new environment
RUN conda env create -f /app/workflow.env.yml

# Define an environment variable with default value
ENV MODE=python

# Define environment variables for the user custom python script and jupyter notebook
ENV USER_PY=
ENV USER_JN=

# Expose the port for the Jupyter notebook server
EXPOSE 8888

# Set the entrypoint script as the entrypoint for the Docker image
ENTRYPOINT ["/bin/bash", "-c"]

# Run either the python script or the jupyter notebook depending on the MODE environment variable
CMD ["\
if [ \"$MODE\" = \"python\" ]; then \
if [ -n \"$USER_PY\" ]; then \
cp /data/\"$USER_PY\" /app/workflow.py; \
fi; \
mkdir -p /data/wf_python; \
cd /data/wf_python; \
conda run --no-capture-output -n $REPOSITORY python /app/workflow.py --config /data/workflow.yml; \
else \
mkdir -p /data/wf_notebook; \
if [ -n \"$USER_JN\" ]; then \
cp /data/\"$USER_JN\" /data/wf_notebook/notebook.ipynb; \
else \
cp /app/notebook.ipynb /data/wf_notebook/notebook.ipynb; \
fi; \
cd /data/wf_notebook; \
source activate $REPOSITORY; \
jupyter notebook --ip=0.0.0.0 --port=8888 --no-browser --allow-root --NotebookApp.token='' --NotebookApp.password=''; \
fi \
"]
47 changes: 47 additions & 0 deletions common/docker/README.md
@@ -0,0 +1,47 @@
# <a name="execute-wf"></a>Execute workflow through docker container

To execute the workflow through a Docker container, please follow the steps below:

## <a name="files"></a>Workflow files

Below you can find the list of all the **files** needed to execute this workflow:

* **Dockerfile:** the file used for building a docker container with this workflow inside.
* **workflow.yml:** the configuration file with the I/O dependencies and settings for each step of the workflow.

## <a name="requirements"></a>Requirements

To execute this BioBB workflow, there is a single requirement: having [Docker](https://docs.docker.com/engine/install/) installed on your computer. Once this requirement is fulfilled, you will be able to install the workflow.

## <a name="installation"></a>Installation

After downloading the workflow files and decompressing them into a folder, please open a terminal in that directory and execute the following command:

docker build -t <container_image> .

Where **container_image** is the name to give the Docker image.
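Note that the Dockerfile in this commit defines a mandatory **REPO** build argument (and an optional **SUBREPO**), and the build is cancelled if **REPO** is missing. Assuming, for example, that the target workflow repository is biobb_wf_md_setup, the build command might therefore look like:

```shell
docker build --build-arg REPO=biobb_wf_md_setup -t <container_image> .
```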

Note: if using an **ARM** architecture such as the **Apple Silicon chips**, please be sure to add the **--platform** flag:

docker build --platform linux/amd64 -t <container_image> .

To add image tags and push the image to a registry:

docker build --platform linux/amd64 -t <container_image>:<tag> .
docker tag <container_image>:<tag> <container_image>:latest
docker push <container_image>:<tag>
docker push <container_image>:latest

## <a name="run-wf"></a>Run workflow

After that, the only thing left is to run the workflow:

docker run -w /data -v /path/to/inputs:/data <container_image>

Where **/path/to/inputs** is the path to the folder where the input file(s) and the workflow.yml file are located (all of them must be in the same folder), and **container_image** is the name of the Docker image.
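By default the container runs in Python mode. To use the notebook mode defined in the Dockerfile instead, a run command might look like this (same placeholders as above):

```shell
docker run -e MODE=notebook -p 8888:8888 -w /data -v /path/to/inputs:/data <container_image>
```

The Jupyter server then listens on http://localhost:8888 without a token or password, as configured in the Dockerfile.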

Take into account that depending on the number of steps, the tools executed and the settings provided, along with the power of your computer, the execution of the workflow can take from a **few minutes** to **several hours**. The workflow progress will be shown in your terminal.

## <a name="get-output"></a>Get output results

Once the workflow is finished, you should just enter the new **wf_name_of_workflow** folder and, inside it, you will find a folder for each step of the workflow with all the files generated in every step.
56 changes: 56 additions & 0 deletions common/python/README.md
@@ -0,0 +1,56 @@
# <a name="execute-wf"></a>Execute workflow through python script

To execute the workflow through a Python script, please follow the steps below:

## <a name="files"></a>Workflow files

Below you can find the list of all the **files** needed to execute this workflow:

* **workflow.py:** the python file with all the steps to execute this workflow.
* **workflow.yml:** the configuration file with the I/O dependencies and settings for each step of the workflow.
* **workflow.env.yml:** the environment file needed to create a conda environment in which this workflow will be run.

## <a name="requirements"></a>Requirements

To execute a BioBB workflow in Python, there is a single requirement: having [Anaconda](https://docs.anaconda.com/anaconda/install/index.html) installed on your computer. Once this requirement is fulfilled, you will be able to install the workflow.

The BioBBs are fully compatible with **Linux** and **macOS**. To run them on **Windows 10**, you should use the Windows Subsystem for Linux. On the official BioBB website, [there is a tutorial](https://mmb.irbbarcelona.org/biobb/availability/tutorials/windows) explaining how to do it.

## <a name="installation"></a>Installation

After downloading the workflow files and decompressing them into a folder, please open a terminal in that directory and execute the following command:

conda env create --file workflow.env.yml

This process can take a while, and once it is finished you will have an environment with **all the dependencies** needed to run this workflow. To activate this environment, please follow the instructions given by the conda installer. Just before the installation finishes, the terminal will print the following message:

```shell
#
# To activate this environment, use
#
# $ conda activate name_of_environment
#
# To deactivate an active environment, use
#
# $ conda deactivate
```

So execute the following command (replacing name_of_environment with the name shown in your terminal):

conda activate name_of_environment

## <a name="custom-paths"></a>Custom paths

To run this workflow properly on your computer, you should open the **workflow.yml** file in a text/code editor and replace all the occurrences of **/path/to/inputs/** with the absolute path to the folder where you have decompressed the zip file downloaded in the first step.
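A quick way to do this replacement from the terminal, assuming GNU sed on Linux (on macOS, use `sed -i ''` instead), might be:

```shell
# Replace every occurrence of the placeholder with your actual input folder
sed -i 's|/path/to/inputs|/absolute/path/to/your/folder|g' workflow.yml
```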

## <a name="run-wf"></a>Run workflow

After that, the only thing left is to run the workflow:

python workflow.py --config workflow.yml

Take into account that depending on the number of steps, the tools executed and the settings provided, along with the power of your computer, the execution of the workflow can take from a **few minutes** to **several hours**. The workflow progress will be shown in your terminal.

## <a name="get-output"></a>Get output results

Once the workflow is finished, you should just enter the new **wf_name_of_workflow** folder and, inside it, you will find a folder for each step of the workflow with all the files generated in every step.
