Beyond Jupyter Notebooks

Greg Jan
Published in Nerd For Tech
6 min read · Dec 7, 2020


Part 1: Non-ML Model deployment with Flask and Docker

Welcome to part 1. Happy you made it through part 0 (Part 0: Setting up a ML project), which is not, let’s be honest, the most exciting one. In this article you will learn how to deploy a very simple non-machine-learning model with Flask and build a container for it with Docker.

More precisely we will:

  • Create production-ready code for our non-machine-learning model
  • Test our model
  • Create a web app serving our model
  • Containerize our web app

Non-machine learning model

The non-machine learning model looks like this:
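In short, the model is nothing more than the function y = x², as the example later in this article (x = 100 returning y = 10000) confirms.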

The model is kept deliberately simple so you can take your eyes off the modeling part and focus on deployment. The following Python class Modeler does nothing but return the function above, with x as the input to its predict method.
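A minimal sketch of what such a class could look like (assuming it lives in a file called modeler.py; the author’s actual code may differ):

class Modeler:
    """A non-machine-learning "model" that simply squares its input."""

    def __init__(self):
        # Nothing to set up for this toy model
        pass

    def train(self, data=None):
        # Placeholder for later articles; not used by this simple model
        pass

    def predict(self, x):
        # Return y = x**2
        return x ** 2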

A few functions in the piece of code above are not used yet. The reason is that we will reuse the same code structure in the following articles, where the complexity of the model will increase. Basically, we will keep building on the same code in the future.

Unit testing

Testing is the natural next step since we now have a model. Pytest is a framework that makes it easy to write small tests in Python. To install it run:

conda install -c anaconda pytest

Pytest will run any test code whose file name starts with “test_”. Here is the code that tests our Modeler class:
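A minimal sketch of such a test file, say test_modeler.py, could look like this (the exact input/output pairs are just examples):

import pytest

from modeler import Modeler

# Pairs of input value / expected output
@pytest.mark.parametrize("x, expected", [(0, 0), (2, 4), (100, 10000)])
def test_predict(x, expected):
    m = Modeler()
    assert m.predict(x) == expected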

After importing our Modeler class and the pytest library, we use the decorator @pytest.mark.parametrize to define pairs of input value / expected output. In the assert statement we check that the actual outputs equal the expected outputs.

To launch the tests, just run pytest in a terminal. If successful, the output should look like this:
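The exact report depends on your pytest version, but a successful run ends with a summary line along the lines of:

=========== 3 passed in 0.03s ===========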

Flask app

Now that our test was successful we can build our web app. What is Flask, you may ask? Flask is a micro framework for web applications. In other words, it makes creating a web app easy for you. If you want more info on Flask, check out its website: https://flask.palletsprojects.com/en/1.1.x/.
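A minimal sketch of app.py, reconstructed from the breakdown below (the /predict route, POST method and post() function name follow that description; the rest is an assumption):

from flask import Flask, request, jsonify

from modeler import Modeler

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def post():
    # Read the JSON data sent to the web app
    data = request.get_json()
    # Extract the value of x
    x = data["x"]
    # Instantiate the Modeler and compute the prediction
    m = Modeler()
    prediction = m.predict(x)
    # Return the input and the prediction as JSON
    return jsonify({"x": x, "prediction": prediction})

if __name__ == "__main__":
    app.run()  # local development server on 127.0.0.1:5000
    # app.run(host="0.0.0.0", port=8080)  # un-comment before building the Docker container (and comment the line above)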

Let’s break down this program:

  1. First the Flask class is imported along with our Modeler class.
  2. Next we create an instance of this class. The first argument is the name of the application’s module or package; __name__ is the convenient default when using a single module.
  3. Then the @app.route decorator tells Flask what URL should trigger our function (in this case localhost:5000/predict) and that the method used will be POST. A POST request means that we will send data to our web app.
  4. Then we enter the post() function:
    4.1. Since we will be sending data as JSON format to our web app we read it using request.get_json() and store it in data.
    4.2. Extract the value of x from data and store it in the variable x.
    4.3. Instantiate our Modeler class as m and feed its predict function with x.
    4.4. Return the input x and the output prediction under JSON format.
  5. The final if condition checks whether our Python program app.py is being run directly rather than imported. If so, it launches the Flask server on the default localhost URL (127.0.0.1) through port 5000.

Now all you have to do to run our Flask app is to write in a terminal:

python app.py

Now you should see something like this:
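The exact message depends on your Flask version, but it should end with something along the lines of:

 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)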

There is no front-end on our web app, so if you open your browser and go to the given URL (http://127.0.0.1:5000/) you will get a “Not Found” error message.

That said the back-end should be perfectly working and we can now query our app. Let’s try it!

Here a notebook is a good option (but not mandatory):
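A minimal cell using the requests library could look like this:

import requests

# Send x = 100 to the local Flask app and print its JSON response
response = requests.post("http://127.0.0.1:5000/predict", json={"x": 100})
print(response.json())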

We use the requests package to send our POST request to the web app. All we have to specify is the URL and the input in JSON format. The model gives us the prediction y = 10000, which is indeed exactly 100².

Containerizing with Docker

We now have a local web app that is successfully serving our non-machine-learning model. Congratulations!

But we want our app to be seen and to shine for everyone in this world, don’t we? Of course we do. To do so we are introducing Docker containers. If you are not familiar with Docker containers, think of them exactly like containers on a ship: your program runs in a container, isolated from the external world, so it runs exactly as intended and is fully portable. For more info and to get Docker installed: https://docs.docker.com/get-docker/

So all we need now is a Dockerfile to configure our container (given that Docker is installed). You can place the following file at the root of your project folder:
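A minimal sketch of such a Dockerfile, following the description below (the author’s actual file may differ in its exact lines):

FROM python:3.7-slim

# Update the base image with the latest packages
RUN apt-get update && apt-get upgrade -y

# Copy and install the Python libraries frozen in requirements.txt
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy all project files and folders into the container
COPY . .

# Launch the web app (app.py must be set to run on 0.0.0.0:8080, see below)
CMD ["python", "app.py"]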

The base image used here is python:3.7-slim (which is Debian based). The RUN commands update the base image with the latest packages. Two critical lines in the Dockerfile copy requirements.txt and install all the Python libraries it lists. This means that the same library versions will be used in the container as in your local Anaconda environment. To create this file, run the following in a terminal (with your Anaconda environment activated):

pip freeze > requirements.txt

The COPY line will then copy all the files and folders to the container. Finally, the CMD line launches the web app by running app.py on URL 0.0.0.0 and port 8080. This is an “all interfaces” address, which is accessible locally via localhost.

Before building our container, don’t forget to update app.py: comment out the app.run line that serves on localhost:5000 and un-comment the one that routes the app to URL 0.0.0.0 through port 8080.

Ok now we are all set. Let’s make that container:

docker build --tag <containername> .

It may take a bit of time depending on the speed of your connection. But once it is done you can launch the container:

docker run -p 8080:8080 <containername>:latest

Next step, we test. As in our local version we can run the following notebook, only updating the port from 5000 to 8080:
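With the sketch from earlier, that only means pointing the request at port 8080:

import requests

# Same request as before, but the container listens on port 8080
response = requests.post("http://127.0.0.1:8080/predict", json={"x": 100})
print(response.json())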

We get the same response as we got locally. You have made it. CONGRATULATIONS!

Now we could push this container to any cloud vendor’s registry, such as GCP, Azure or AWS, and make it accessible to the world. But I will not cover that in this article.

Even though our model was very simple, there was still quite a bit of learning. Hopefully this has given you a better picture of how to deploy a model. Jupyter Notebooks were not used except for testing our app; instead we used production code and configuration files.

You can access all the code in my GitHub repo: https://github.com/GregoireJan/xflask

In the next article we will make a nice front-end to our web app by using Streamlit!

Here are the parts of the series Beyond Jupyter notebooks:

  • Part 0: Setting up a ML project
  • Part 1: Non-ML Model deployment with Flask and Docker (this article)