Dockerized local dev environment for testing the Argo DB Django app with the BGC data processing app
Install Docker on your computer: https://docs.docker.com/engine/install/
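You can confirm that Docker and the Compose plugin are available with their standard version commands:

```bash
docker --version
docker compose version
```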
- copy this repo to your local computer:

```bash
git clone https://github.com/WHOIGit/argodb-docker-local.git
```
- move into the repo directory:

```bash
cd argodb-docker-local
```
- install submodule repos:

```bash
git submodule init
git submodule update
```
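These two steps can also be combined into a single standard git command:

```bash
git submodule update --init
```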
- create two new local directories for testing data and output files:

```bash
mkdir bgc-processing-data
mkdir testing-data
```
- you should now have the following directory structure:

```
argodb-docker-local/
├── argo-db-backend/
├── bgc-processing/
├── bgc-processing-data/
├── testing-data/
└── docker-compose.yml
```
- create `.env.local` files for the `argo-db-backend/` and `bgc-processing/` directories: copy the `.env.example` file in each directory and rename it to `.env.local`, then add values for the empty `DB_PASSWORD`/`POSTGRES_PASSWORD` environment variables (see the sketch below).
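A minimal sketch of the copy-and-edit step; the exact variable layout comes from each `.env.example`, and the password value here is a placeholder:

```bash
cp argo-db-backend/.env.example argo-db-backend/.env.local
cp bgc-processing/.env.example bgc-processing/.env.local

# then edit each .env.local and set a value, e.g.:
# DB_PASSWORD=<your-local-password>
# POSTGRES_PASSWORD=<your-local-password>
# (the two passwords presumably need to match so Django can connect to Postgres)
```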
- build local docker images:

```bash
docker compose build
```
- download example database file: Download SQL file
- seed the local database with the downloaded data (the `restore` command may take a few minutes):

```bash
docker compose up -d postgres
docker cp /local/path/to/file/bgc-db.sql.gz postgres:backups/
docker compose exec postgres restore bgc-db.sql.gz
docker compose down
```
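To spot-check that the restore worked, you can list tables from inside the Postgres container; the `postgres` user and database names here are assumptions, so substitute the values from your `.env.local`:

```bash
docker compose up -d postgres
# -U/-d values are assumptions; use your .env.local settings
docker compose exec postgres psql -U postgres -d postgres -c '\dt'
```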
- `cd` into the `argodb-docker-local` repo directory
- run `docker compose up` to start the full application stack (use `docker compose up -d` if you want the containers to run in the background without displaying output)
This will start the Django application, the Postgres DB, and the BGC Processing application.
You can access the Django application at http://localhost:8000/metadata-admin/. You can use your current login info from the production site.
You can run Django management commands by executing `docker compose run --rm argodb python manage.py <command_name>`. All commands should be executed in the `argodb-docker-local` directory.
For example, to add a new admin user to access the Django application, you can run:

```bash
docker compose run --rm argodb python manage.py createsuperuser
```
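Other management commands follow the same pattern. For instance, applying pending database migrations (a standard Django command):

```bash
docker compose run --rm argodb python manage.py migrate
```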
- When the `bgc-processing` container starts, it will automatically execute the `navis_batch_process.py` script, parsing any test data that you place in the `testing-data` directory. The container follows the same naming conventions as the production application, so data should be in a deployment-specific subdirectory using the `wn1234` format as a name (see the staging sketch below). Ex:

```
testing-data/
└── wn1234/
    ├── 1234.000.isus
    ├── 1234.001.log
    └── 1234.001.msg
```
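A quick way to stage test data in this layout; the source path is a hypothetical placeholder for wherever your float files live:

```bash
mkdir -p testing-data/wn1234
# /path/to/float/files is a placeholder; copy your own .isus/.log/.msg files
cp /path/to/float/files/1234.* testing-data/wn1234/
```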
- The container will stop after the data parsing is complete. To parse new data, simply start it up again:

```bash
docker compose up bgc-processing
```
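If the containers are running in the background, you can follow the parsing output with the standard Compose logs command:

```bash
docker compose logs -f bgc-processing
```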
- Data will be added to the local Django application as it's parsed. You can verify the new data at http://localhost:8000/metadata-admin/
- run `docker compose down` in the `argodb-docker-local` directory to stop all containers.
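If you also want to reset the seeded database, Compose's standard `-v` flag removes the project's volumes as well; assuming the Postgres data is stored in a named volume, this wipes the database and you will need to repeat the restore step:

```bash
docker compose down -v
```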