Serving and Deploying Keras Models using Flask, uWSGI, NGINX and Docker

Anyone who has tried (and failed) to deploy their Keras models will understand the frustration of getting your trained models out there on the cloud. It is all well and good having all of the code sitting there in a Jupyter Notebook, but eventually you will want to deploy and run these models at scale. Hopefully this guide will help!

The Problem

There really isn’t much documentation out there on how to serve a Keras model publicly, and the setup of tools such as Flask, uWSGI and NGINX adds further complexity, especially if you come from the Go or Node world. DigitalOcean provides a reasonable guide for setting up this stack on their virtual machines, but unfortunately it breaks when you try to predict with a Keras model due to thread safety, and it takes quite a lot of research and tweaking to get it running on this stack. All of this leads to a lot of wasted time fiddling around configuring a VM.
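For reference, the thread-safety fix itself is small once you know it: with TF 1.x-era Keras you load the model once, capture the default TensorFlow graph, and re-enter that graph inside the request handler. A minimal sketch of the Flask app (the file, endpoint and payload names here are illustrative, not from the original):

    # main.py - sketch of the Keras thread-safety workaround behind uWSGI
    import numpy as np
    import tensorflow as tf
    from flask import Flask, jsonify, request
    from keras.models import load_model

    app = Flask(__name__)

    # Load the model once at startup and capture the default graph;
    # request threads spawned by the server won't see it otherwise.
    model = load_model('model/model.h5')
    graph = tf.get_default_graph()

    @app.route('/predict', methods=['POST'])
    def predict():
        # Expects JSON like {"instances": [[...], ...]} matching the model input
        x = np.array(request.json['instances'])
        # Re-enter the captured graph so predict() works off the main thread
        with graph.as_default():
            preds = model.predict(x)
        return jsonify(predictions=preds.tolist())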

Enter Docker

Docker operates at the container level, rather than the VM level. Setting aside the technicalities, this means that you can focus on writing your code and leave the configuration to someone else out there on the internet. I have created a Docker image that installs all of the serving stack, so NGINX, uWSGI and Flask, with all of the necessary configuration bundled in. Once you have written and packaged your code into this container, it can be deployed to Google Cloud, Azure, AWS or DigitalOcean with no need to concern yourself with config.

So let's assume the following application tree:
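A sketch of what that tree might look like (the file names are illustrative, matching the Flask sketch above):

    .
    ├── app
    │   ├── main.py        # Flask app with the prediction endpoint
    │   ├── model
    │   │   └── model.h5   # trained Keras model weights
    │   └── uwsgi.ini      # uWSGI configuration
    └── Dockerfile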

Now we populate the Dockerfile.
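A minimal Dockerfile, assuming the tree above and that the base image serves whatever is copied into /app (as tiangolo-style images do), might look like this:

    FROM joelogan/keras-tensorflow-flask-uwsgi-nginx-docker

    # Copy the application code into the directory the base image serves from
    COPY ./app /app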

Note that the joelogan/keras-tensorflow-flask-uwsgi-nginx-docker image installs all of the serving frameworks, Python and a number of dependencies, such as Keras, TensorFlow, Pillow, Matplotlib and h5py, so that you can get up and running with serving your models easily. If you need any additional libraries (for example, Pandas), you can add them in the Dockerfile as follows:
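A sketch of the extended Dockerfile (pin versions as needed):

    FROM joelogan/keras-tensorflow-flask-uwsgi-nginx-docker

    # Install any extra Python libraries on top of the base image
    RUN pip install pandas

    COPY ./app /app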

Then we run docker build, which will produce an image inheriting all of the serving code (and Python, TensorFlow, Matplotlib and a few other dependencies) and copy your app code into the image.
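For example, from the directory containing the Dockerfile (the image tag is your choice):

    docker build -t keras-flask-app .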

Docker will proceed to download all of the dependencies and build the image. Then we can run it as a container and test it on your local machine:
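A sketch, assuming the base image exposes the app through NGINX on port 80 (the tag, container name and port mapping are illustrative):

    docker run -d --name keras-serving -p 8080:80 keras-flask-app

    # Test the prediction endpoint with a payload matching the Flask sketch above
    curl -X POST -H "Content-Type: application/json" \
         -d '{"instances": [[0.1, 0.2, 0.3, 0.4]]}' \
         http://localhost:8080/predict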

Or deploy it somewhere else, such as a DigitalOcean VM or Google Cloud.

And a big shout out to tiangolo, from whom I inherited the base for the keras-tensorflow-flask-uwsgi-nginx-docker image.
