Merge pull request #433 from AllenInstitute/docker
try adding docker to workflow
rcpeene authored Nov 26, 2024
2 parents b3446f7 + ede92f4 commit 04a2ee5
Showing 7 changed files with 112 additions and 76 deletions.
24 changes: 14 additions & 10 deletions .github/workflows/build.yml
@@ -12,6 +12,8 @@ jobs:
build:
runs-on:
group: LargerInstance
container:
image: rcpeene/openscope_databook:latest

env:
DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}
@@ -21,27 +23,27 @@ jobs:
- uses: actions/checkout@v3
with:
fetch-depth: 0
ref: main
ref: ${{ github.ref }}

# - name: Set up Python
# uses: actions/setup-python@v4
# with:
# python-version: "3.11"

- name: Upgrading pip
run: pip install --upgrade pip
# - name: Upgrading pip
# run: pip install --upgrade pip

- name: Install deps
run: pip install cython numpy
# - name: Install deps
# run: pip install cython numpy

- name: pip freeze
run: pip freeze

- name: Installing packages again (this prevents a weird error)
run: pip install -r requirements.txt
# - name: Installing packages again (this prevents a weird error)
# run: pip install -r requirements.txt

- name: Installing package
run: pip install -e .
# - name: Installing package
# run: pip install -e .

- name: Installing build dependencies
run: |
@@ -85,7 +87,9 @@ jobs:
rm ./docs/embargoed/*.nwb
- name: Printing log
run: git status
run: |
git config --global --add safe.directory /__w/openscope_databook/openscope_databook
git status
- name: Printing shortlog
run: git log | git shortlog -sn
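The `git config --global --add safe.directory` step added to the workflow works around Git's ownership check: inside the container, the checkout under `/__w/openscope_databook/openscope_databook` is owned by a different user than the one running `git`, and newer Git versions refuse to operate on such "dubious ownership" repositories unless the path is allow-listed. A minimal sketch of the setting, using a throwaway config file so it leaves the real `~/.gitconfig` alone:

```shell
# Use a temporary file as the "global" config for this demo
# (GIT_CONFIG_GLOBAL requires Git >= 2.32).
export GIT_CONFIG_GLOBAL="$(mktemp)"

# Allow-list the runner's workspace path, as the workflow step does.
git config --global --add safe.directory /__w/openscope_databook/openscope_databook

# The path now appears in the allow-list.
git config --global --get-all safe.directory
```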
18 changes: 10 additions & 8 deletions .github/workflows/test.yml
@@ -11,6 +11,8 @@ jobs:
test:
runs-on:
group: LargerInstance
container:
image: rcpeene/openscope_databook:latest

env:
DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}
@@ -19,20 +21,20 @@ jobs:
steps:
- uses: actions/checkout@v3

- name: Upgrading pip
run: pip install --upgrade pip
# - name: Upgrading pip
# run: pip install --upgrade pip

- name: print environment
run: pip freeze

- name: Install cython
run: pip install cython numpy
# - name: Install cython
# run: pip install cython numpy

- name: Installing package
run: pip install -e .
# - name: Installing package
# run: pip install -e .

- name: Installing requirements
run: pip install -r ./requirements.txt
# - name: Installing requirements
# run: pip install -r ./requirements.txt

- name: Installing build dependencies
run: |
22 changes: 22 additions & 0 deletions Dockerfile
@@ -0,0 +1,22 @@
FROM ubuntu:22.04
# base requirements
RUN apt-get update
RUN apt-get install -y coreutils
RUN apt-get install -y libgl1-mesa-glx
RUN apt-get install -y libglib2.0-0
RUN apt-get install -y python3 python3-pip
RUN apt-get install -y git

RUN git config --global --add safe.directory /__w/openscope_databook/openscope_databook

# copy databook setup files
COPY requirements.txt ./openscope_databook/requirements.txt
COPY setup.py ./openscope_databook/setup.py
COPY README.md ./openscope_databook/README.md
COPY LICENSE.txt ./openscope_databook/LICENSE.txt
COPY databook_utils ./openscope_databook/databook_utils

# for reasons I don't understand, these must be installed before the rest of the requirements
RUN pip install numpy cython
# set up databook dependencies
RUN pip install -e ./openscope_databook[dev]
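Assuming Docker is installed locally, the image above can be built and given a quick smoke test with something like the following (the `openscope_databook` tag is arbitrary):

```shell
# Build from the repo root: the COPY lines above assume the build
# context is the databook checkout itself.
docker build -t openscope_databook .

# Smoke test: list the Python packages baked into the image.
docker run --rm openscope_databook pip freeze
```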
89 changes: 35 additions & 54 deletions docs/embargoed/cell_matching.ipynb
@@ -50,12 +50,10 @@
"import json\n",
"import os\n",
"\n",
"import matplotlib as mpl\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"from PIL import Image\n",
"from time import sleep"
"from PIL import Image"
]
},
{
@@ -93,6 +91,13 @@
"id": "77d78e7d",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"A newer version (0.63.1) of dandi/dandi-cli is available. You are using 0.61.2\n"
]
},
{
"name": "stdout",
"output_type": "stream",
@@ -255,66 +260,42 @@
"name": "stderr",
"output_type": "stream",
"text": [
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:NwayMatching:NWAY_COMMIT_SHA None\n",
"INFO:NwayMatching:Nway matching version 0.6.0\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"WARNING:root:many=True not supported from argparse\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:PairwiseMatching:Matching 1193675753 to 1194754135\n",
"INFO:PairwiseMatching:Matching 1193675753 to 1194754135: best registration was ['Crop', 'CLAHE', 'PhaseCorrelate']\n",
"multiprocessing.pool.RemoteTraceback: \n",
"\"\"\"\n",
"Traceback (most recent call last):\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 125, in worker\n",
" result = (True, func(*args, **kwds))\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 48, in mapstar\n",
" return list(map(*args))\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 121, in pair_match_job\n",
" pair_match.run()\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\pairwise_matching.py\", line 495, in run\n",
" segmask_moving_3d_registered = transform_mask(\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\pairwise_matching.py\", line 384, in transform_mask\n",
" dtype=np.int)\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\numpy\\__init__.py\", line 338, in __getattr__\n",
" raise AttributeError(__former_attrs__[attr])\n",
"AttributeError: module 'numpy' has no attribute 'int'.\n",
"`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.\n",
"The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:\n",
" https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n",
"\"\"\"\n",
"\n",
"The above exception was the direct cause of the following exception:\n",
"\n",
"Traceback (most recent call last):\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\runpy.py\", line 196, in _run_module_as_main\n",
" return _run_code(code, main_globals, None,\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\runpy.py\", line 86, in _run_code\n",
" exec(code, run_globals)\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 502, in <module>\n",
" nmod.run()\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 462, in run\n",
" self.pair_matches = pool.map(pair_match_job, pair_arg_list)\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 367, in map\n",
" return self._map_async(func, iterable, mapstar, chunksize).get()\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 774, in get\n",
" raise self._value\n",
"AttributeError: module 'numpy' has no attribute 'int'.\n",
"`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.\n",
"The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:\n",
" https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n"
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\utils.py:48: FutureWarning: In a future version of pandas all arguments of DataFrame.sort_index will be keyword-only.\n",
" df = df.sort_index(0)\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\utils.py:49: FutureWarning: In a future version of pandas all arguments of DataFrame.sort_index will be keyword-only.\n",
" df = df.sort_index(1)\n",
"INFO:NwayMatching:registration success(1) or failure (0):\n",
" 0 1\n",
"0 1 1\n",
"1 1 1\n",
"id map{\n",
" \"0\": 1193675753,\n",
" \"1\": 1194754135\n",
"}\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\nway_matching.py:208: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.\n",
" matching_frame = matching_frame.append(pairframe)\n",
"INFO:NwayMatching:Nway matching is done!\n",
"INFO:NwayMatching:Creating match summary plots\n",
"WARNING:root:setting Dict fields not supported from argparse\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\argschema\\utils.py:346: FutureWarning: '--nway_output.nway_matches' is using old-style command-line syntax with each element as a separate argument. This will not be supported in argschema after 2.0. See http://argschema.readthedocs.io/en/master/user/intro.html#command-line-specification for details.\n",
" warnings.warn(warn_msg, FutureWarning)\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:NwayMatching:wrote matching_output\\nway_match_fraction_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote matching_output\\nway_warp_overlay_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote matching_output\\nway_warp_summary_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote ./output.json\n"
]
}
],
"source": [
"!python -m nway.nway_matching --input_json input.json --output_json \"./output.json\" --output_dir matching_output"
"!python3 -m nway.nway_matching --input_json input.json --output_json \"./output.json\" --output_dir matching_output"
]
},
{
@@ -385,7 +366,7 @@
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x1c3b53e35b0>"
"<matplotlib.image.AxesImage at 0x21dff47bfa0>"
]
},
"execution_count": 13,
@@ -421,7 +402,7 @@
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x1c3b7dbdf00>"
"<matplotlib.image.AxesImage at 0x21dff4fe680>"
]
},
"execution_count": 14,
20 changes: 19 additions & 1 deletion docs/intro.md
@@ -97,7 +97,9 @@ You can download an individual notebook by pressing the `Download` button in the
```
pip install -e .
```
It is recommended that this is done within a conda environment using Python 3.10 to minimize any interference with local machine environments. For information on installing and using conda, go [here](https://conda.io/projects/conda/en/latest/user-guide/getting-started.html). *Before* running the pip installation above, you can create a conda environment in the conda prompt with the command

#### Locally (Conda)
It is recommended that this be done within a conda environment using Python 3.10, or within Docker, to minimize interference with your local machine's environment. For information on installing and using conda, go [here](https://conda.io/projects/conda/en/latest/user-guide/getting-started.html). *Before* running the pip installation above, you can create a conda environment in the conda prompt with the command
```
conda create -n databook_env python=3.10
```
@@ -106,6 +108,22 @@ and you can run that environment with
conda activate databook_env
```


#### Locally (Docker)
The Databook also includes a Dockerfile. If you want to build a Docker image for the Databook yourself, you can do so by running the following command in the Databook's main directory once you have Docker installed and running
```
docker build -t openscope_databook .
```
You can then start the container by running the following command. Note that, to access the Databook in your host machine's web browser, port 8888 must be mapped to the container's port 8888.
```
docker run -p 8888:8888 openscope_databook
```
Instead of building the image yourself, you can use the main image that we maintain, registered publicly on Docker Hub, with the following command

```
docker run -p 8888:8888 rcpeene/openscope_databook:latest
```

#### Locally (Running Notebook)
Once your environment is set up, you can execute the notebooks in Jupyter by running the following command within the repo directory:
```
jupyter notebook
```
3 changes: 2 additions & 1 deletion requirements.txt
@@ -1,3 +1,4 @@
autograd==1.3
ccfwidget==0.5.3
cebra
cython
@@ -21,7 +22,7 @@ quantities==0.14.1
remfile==0.1.10
scikit-image==0.19.3
scipy==1.9.3
ssm
ssm @ git+https://github.com/lindermanlab/ssm
statsmodels==0.14.0
suite2p==0.12.1
tensortools==0.4
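The updated `ssm` line uses PEP 508 direct-reference syntax (`name @ url`), which tells pip to clone the Git URL instead of resolving `ssm` on PyPI. The string splits cleanly into a name and a URL, shown here with the `packaging` library (the same one pip uses internally); this is a sketch and assumes `packaging` is installed:

```shell
# Parse the requirement string the way pip does: the part before "@"
# is the distribution name, the part after is the VCS URL to install from.
python3 -c 'from packaging.requirements import Requirement
r = Requirement("ssm @ git+https://github.com/lindermanlab/ssm")
print(r.name)
print(r.url)'
```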
12 changes: 10 additions & 2 deletions setup.py
@@ -1,4 +1,4 @@
from setuptools import setup, find_packages
from setuptools import setup

with open("README.md", encoding="utf-8") as f:
readme = f.read()
@@ -20,5 +20,13 @@
url="https://github.com/AllenInstitute/openscope_databook",
license=license,
package_dir={"databook_utils": "databook_utils"},
install_requires=required
install_requires=required,
extras_require={
"dev": [
"markupsafe==2.0.1",
"jupyter-book==1.0.0",
"nbmake==1.5.3",
"pytest-xdist==3.5.0"
]
}
)
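With the `extras_require` block above, the pinned docs/test tooling becomes opt-in via the `dev` extra. A sketch of the two install variants (the quotes guard the brackets from shell globbing):

```shell
# Runtime dependencies only (requirements.txt):
pip install -e .

# Runtime dependencies plus the "dev" extra -- jupyter-book, nbmake,
# pytest-xdist -- which is what the Dockerfile runs:
pip install -e ".[dev]"
```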
