Lobe is a free, easy-to-use app that has everything you need to bring your machine learning ideas to life. This Flask starter project creates a REST API so your projects or apps can get predictions from a TensorFlow model exported from Lobe. To start using it, follow the instructions below:
- Clone or download the project on your computer to get started. You'll also need Python 3.6, 3.7, 3.8, or 3.9 to run this starter project.
- Export a TensorFlow model from Lobe.
- Move the `saved_model.pb` file, `variables` folder, and `signature.json` file exported from Lobe to the `/model` folder.
- Create and activate a virtual environment (Windows)

  ```
  python -m venv .venv
  .venv\Scripts\activate
  ```

- Install dependencies

  ```
  python -m pip install --upgrade pip && pip install -r requirements.txt
  ```

- Run the server locally

  ```
  python app.py
  ```

- Create and activate a virtual environment (macOS/Linux)

  ```
  python -m venv .venv
  source .venv/bin/activate
  ```

- Install dependencies

  ```
  python -m pip install --upgrade pip && pip install -r requirements.txt
  ```

- Run the server

  ```
  python app.py
  # or
  export FLASK_APP=app.py
  flask run
  ```
- Have version 2.0.80 or higher of the Azure CLI installed.

  ```
  az --version
  ```

- Log in by running this command and following the prompts

  ```
  az login
  ```

- Deploy to the cloud!

  ```
  az webapp up --sku B1 --name <your unique app name>
  ```
The Azure documentation is available if you run into issues; its quickstart guide is a good starting point.
- Perform a POST request to `<target url>/predict` with your base64-encoded image in the request body. Refer to `testing.py` for getting started sending requests to the server; a minimal example request is also sketched after the response sample below.

  ```
  {
    "image": "<base64 image>"
  }
  ```
- Successful requests return JSON with the confidences of your predictions.

  ```
  {
    "predictions": [
      {
        "predicted_label": 0.9105
      },
      {
        "another_label": 0.0895
      }
    ]
  }
  ```
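The snippet below is a minimal sketch of such a request using only the Python standard library. It assumes the server is running locally on Flask's default port (5000) and that you have an image file to send; the `test.jpg` name is just a placeholder. `testing.py` in this project is the reference client.

```python
import base64
import json
from urllib import request

# Read a local image and base64-encode it (test.jpg is a placeholder name).
with open("test.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# POST the JSON body {"image": "<base64 image>"} to the /predict endpoint.
req = request.Request(
    "http://localhost:5000/predict",
    data=json.dumps({"image": image_b64}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with request.urlopen(req) as resp:
    # Prints the {"predictions": [...]} JSON shown above.
    print(json.loads(resp.read().decode("utf-8")))
```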
The Flask starter project is optimized for models exported from Lobe, but it can be used with any TensorFlow model with some small updates. Lobe has a built-in endpoint called Lobe Connect that can be used while running the app, and this starter project works the same way: if your app works with Lobe Connect, it will work with this starter project just by updating the URL.
We are using TensorFlow 2.7.0 to run the `tf_example.py` file. If you see any GPU errors or want to run the script on a GPU, please refer to https://www.tensorflow.org/install/gpu.
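As a quick sanity check, the sketch below (not part of the starter project) prints the TensorFlow version and the GPUs TensorFlow can see; hiding the GPUs via `CUDA_VISIBLE_DEVICES` before importing TensorFlow is one common way to force CPU-only execution if GPU errors get in the way.

```python
# Uncomment these two lines to hide all GPUs and force CPU-only execution:
# import os
# os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

import tensorflow as tf

print("TensorFlow version:", tf.__version__)             # the starter uses 2.7.0
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))
```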
The code takes in a base64-encoded image and returns an array of predictions and confidences. The server code that defines the endpoints is in `app.py`, and the code for using your model, including image pre-processing and output formatting for a prediction, is in `tf_model_helper.py`. For reference, the Swagger definition file lives in `swagger/`.
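To make that split concrete, here is a minimal sketch of how a `/predict` endpoint can hand the base64 image off to a model helper. The commented-out `TFModel` import and `predict_from_base64` call are hypothetical placeholders, not the project's actual names; the real endpoint definitions live in `app.py` and the real model code in `tf_model_helper.py`.

```python
from flask import Flask, jsonify, request

# from tf_model_helper import TFModel   # hypothetical; see tf_model_helper.py
app = Flask(__name__)
# model = TFModel("model")              # would load saved_model.pb + signature.json

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    if not payload or "image" not in payload:
        return jsonify({"error": "expected JSON with an 'image' field"}), 400

    # A real helper would decode the base64 string, pre-process the image,
    # run the TensorFlow model, and format the confidences.
    # predictions = model.predict_from_base64(payload["image"])
    predictions = [{"predicted_label": 0.9105}, {"another_label": 0.0895}]  # placeholder

    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run()
```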
GitHub Issues are for reporting bugs, discussing features, and giving general feedback on the Flask starter project. Be sure to check our documentation, FAQ, and past issues before opening any new ones.
To share your project, get feedback on it, and learn more about Lobe, please visit our community on Reddit. We look forward to seeing the amazing projects that can be built when machine learning is made accessible to you.