Commit f64111a: terminology clarification

Ubuntu committed Jan 4, 2020 (1 parent: c38c80d)

Showing 1 changed file, README.md, with 6 additions and 6 deletions.
@@ -4,8 +4,8 @@ An image processing service, for using [YOLOv3-object-detection-tutorial](https

This stack presents two ways of running:

-1. As a Flask app in the `processing` container which contains all the heavy processing dependencies.
-2. As a Flask app in its own lighter `flask` container which passes the processing to the `processing` container via `rq`
+1. As a Flask app in the `processor` container, which contains all the heavy processing dependencies.
+2. As a Flask app in its own lighter `flask` container, which passes the processing to the `processor` container via `rq`.

Initially I implemented mode (2), then realised performance was very fast with mode (1): `~1.6` seconds per image conversion on a CPU. However, in mode (1) the uWSGI worker that handles a request blocks until processing finishes. This matters when scaling: to handle a bulk ingress of images you need a CPU core per concurrent request, and each core is unavailable while its processing happens. With mode (2) many images can be uploaded quickly, then processing capacity can be scaled to demand. In mode (2) the endpoint returns a URL for the image; until processing is complete, that URL currently just returns the string 'processing'.
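
To make the hand-off concrete, here is a minimal sketch of how mode (2) could look with `rq`. The route names, the `redis` hostname, and the `processor.app.process_image` worker path are assumptions for illustration, not necessarily the repo's actual code:

    # Illustrative sketch only: route names and the worker function path are assumed.
    from flask import Flask, request, url_for
    from redis import Redis
    from rq import Queue
    from rq.job import Job

    app = Flask(__name__)
    redis_conn = Redis(host='redis')  # assumed service name from the compose stack
    q = Queue(connection=redis_conn)

    @app.route('/upload', methods=['POST'])
    def upload():
        # Hand the heavy YOLO work to the processor container's rq worker
        # instead of blocking this Flask/uWSGI worker.
        job = q.enqueue('processor.app.process_image', request.files['file'].read())
        return {'image_url': url_for('result', job_id=job.id, _external=True)}

    @app.route('/result/<job_id>')
    def result(job_id):
        job = Job.fetch(job_id, connection=redis_conn)
        if not job.is_finished:
            return 'processing'  # mirrors the behaviour described above
        return job.result

The `flask` container stays responsive because the heavy YOLO work runs in the `processor` container's rq worker.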

@@ -17,15 +17,15 @@ Clone repo.
git clone https://github.com/vulcan25/image_processor
cd image_processor

-Grab the weights. You may have your own. I used the file from `pjreddie.com/media/files/yolov3.weights`. Put this file in the subdirectory `processor/`. When the `processing` container builds, it copies this in as it then runs the `convert.py` script on the models. With this approach the converted models then become part of the docker image.
+Grab the weights. You may have your own. I used the file from `pjreddie.com/media/files/yolov3.weights`. Put this file in the subdirectory `processor/`. When the `processor` container builds, it copies this file in and then runs the `convert.py` script on the models. With this approach the converted models become part of the Docker image.

Build the containers:

docker-compose -f docker-compose.yml -f with-rq-compose.yml build

## mode (1)...

-Launch with mode (1) which will expose a Flask app in the `processing` service on `http://localhost:5001`. This means the Flask app exists on the same container as the image processing dependencies.
+Launch with mode (1), which exposes a Flask app in the `processor` service on `http://localhost:5001`. This means the Flask app lives in the same container as the image processing dependencies.

docker-compose up
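
Once the stack is up you can exercise the endpoint from Python; the `/upload` path and `file` field name below are assumptions for illustration, so check the repo's `app.py` for the real ones:

    # Hypothetical client call: the route and form field name are placeholders.
    import requests

    with open('dog.jpg', 'rb') as f:
        resp = requests.post('http://localhost:5001/upload', files={'file': f})
    print(resp.status_code, resp.text)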

@@ -75,14 +75,14 @@ Viewing this URL will return the string "processing" unless the job is done, at

# Bug List

-- `processing` container image too large at `~3.2GB`
+- `processor` container image too large at `~3.2GB`
- `flask/app.py` uses container fs storage (`/tmp`): probably should find a way around this. Maybe S3 via boto3, or similar (see the sketch after this list).
- `processor/app.py` doesn't support any storage; everything is handled in memory. Code could be added to persist results in some manner.
- `app.py` and `index.html` are similar in the `processor` and `flask` services respectively. Find a better way to make these more modular.
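
As a sketch of the S3 idea mentioned in the list above (the bucket name, key scheme, and helper are placeholders, and credentials would still need configuring via the usual boto3 mechanisms):

    # Hypothetical helper for offloading results to S3 instead of /tmp.
    import io
    import boto3

    s3 = boto3.client('s3')

    def store_result(image_bytes: bytes, key: str) -> str:
        """Upload processed image bytes and return the object's key."""
        s3.upload_fileobj(io.BytesIO(image_bytes), 'image-processor-results', key)
        return key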

# Timing snippet

-The following can be run in the processing container:
+The following can be run in the `processor` container:

You can get into a running container with: `docker exec -it image_processor_processor_1 /bin/bash`
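
The snippet itself is truncated in this view; as a rough stand-in, a timing helper of this shape (the detection function is a placeholder, not the repo's actual entry point) gives the kind of per-image figure quoted above:

    # Placeholder timing helper; pass the repo's real detection function as `process`.
    import time

    def time_per_image(process, image_path, runs=10):
        """Average wall-clock seconds per call of process(image_path)."""
        start = time.perf_counter()
        for _ in range(runs):
            process(image_path)
        return (time.perf_counter() - start) / runs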

