Dockerizing a Hand Written Digits Predictor Service

Source: Deep Learning on Medium


Keras is a library that allows rapid prototyping for deep learning on top of TensorFlow. An important part of a good prototype is that members of a development team can install and modify the project easily. Thus, this post explains how to dockerize a simple prediction service written in Python 3 + Keras and exposed as a Flask service.

The project creates the service using the following tools:

  • Docker: Creates a container for the application.
  • Conda: Manages python packages within the container.
  • Flask: Exposes a REST endpoint with two main functions: train and classify.
  • Keras/TensorFlow: Builds the deep learning model and performs the digit classification.
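To make the shape of the service concrete, here is a minimal Flask sketch with the two endpoints used later in this post. The route names and payload keys match the curl examples below, but the handler bodies are hypothetical placeholders (the fixed `True` and `[5]` responses stand in for the real training and classification logic, which lives in the actual project code):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/digits/train", methods=["POST"])
def train():
    # Placeholder: the real service trains the Keras model on the
    # MNIST batch referenced by the payload.
    payload = request.get_json()
    images_batch_path = payload["images_batch_path"]
    return jsonify(True)

@app.route("/digits/classify", methods=["POST"])
def classify():
    # Placeholder: the real service runs the trained model on the
    # image referenced by the payload and returns the predicted digit.
    payload = request.get_json()
    image_path = payload["image_path"]
    return jsonify([5])

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the port is reachable from outside the container.
    app.run(host="0.0.0.0", port=5000)
```

Binding to `0.0.0.0` (rather than the default `127.0.0.1`) matters inside Docker: otherwise the port mapping added later with `-p 5000:5000` would have nothing to forward to.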

Installing the Project

You can clone the project from GitHub to get a local copy:

git clone https://github.com/ronald-smith-angel/predictor-flask-hd

Python Packages

Miniconda uses the file environment.yml to build the Python environment. This file contains:

name: hm-predictor
channels:
- defaults
- conda-forge
dependencies:
- _tflow_190_select=0.0.3=mkl
- absl-py=0.3.0=py36_0
- astor=0.7.1=py36_0
- blas=1.0=mkl
- ca-certificates=2018.03.07=0
- certifi=2018.4.16=py36_0
- gast=0.2.0=py36_0
- grpcio=1.12.1=py36hdbcaa40_0
- h5py=2.8.0=py36h8d01980_0
- hdf5=1.10.2=hba1933b_1
- intel-openmp=2018.0.3=0
- keras=2.2.2=0
- keras-applications=1.0.4=py36_0
- keras-base=2.2.2=py36_0
- keras-preprocessing=1.0.2=py36_0
- krb5=1.16.1=hc83ff2d_6
- libedit=3.1.20170329=h6b74fdf_2
- libffi=3.2.1=hd88cf55_4
- libgcc-ng=7.2.0=hdf63c60_3
- libgfortran-ng=7.2.0=hdf63c60_3
- libpq=10.4=h1ad7b7a_0
- libprotobuf=3.5.2=h6f1eeef_0
- libstdcxx-ng=7.2.0=hdf63c60_3
- markdown=2.6.11=py36_0
- mkl=2018.0.3=1
- mkl_fft=1.0.4=py36h4414c95_1
- mkl_random=1.0.1=py36h4414c95_1
- ncurses=6.1=hf484d3e_0
- numpy=1.15.0=py36h1b885b7_0
- numpy-base=1.15.0=py36h3dfced4_0
- openssl=1.0.2o=h14c3975_1
- pip=10.0.1=py36_0
- protobuf=3.5.2=py36hf484d3e_1
- python=3.6.6=hc3d631a_0
- pyyaml=3.13=py36h14c3975_0
- readline=7.0=ha6073c6_4
- scipy=1.1.0=py36hc49cb51_0
- setuptools=39.2.0=py36_0
- six=1.11.0=py36_1
- sqlite=3.24.0=h84994c4_0
- tensorboard=1.9.0=py36hf484d3e_0
- tensorflow=1.9.0=mkl_py36h6d6ce78_1
- tensorflow-base=1.9.0=mkl_py36h2ca6a6a_0
- termcolor=1.1.0=py36_1
- tk=8.6.7=hc745277_3
- werkzeug=0.14.1=py36_0
- wheel=0.31.1=py36_0
- xz=5.2.4=h14c3975_4
- yaml=0.1.7=had09818_2
- zlib=1.2.11=ha838bed_2
- flask=0.12.2=py36_0
- pillow=4.2.1=py36_0

Dockerization

Once you have a local copy of the code, you will need Docker installed on your local machine. Example for Ubuntu:

sudo apt install docker-ce

Within the project's main folder, you will find the Dockerfile used:

# Download the latest Miniconda image from the repository.
FROM continuumio/miniconda3:latest

# Define the working folder.
WORKDIR /app

# Copy the classifier_env requirements.
COPY environment.yml /app/environment.yml

# Create the Conda environment.
RUN conda env create -n classifier_env -f environment.yml

# Install the application within the container.
COPY . /app/

# Activate the classifier_env environment.
ENV PATH /opt/conda/envs/classifier_env/bin:$PATH

# Add the command to deploy the service.
CMD python -u service.py

This Dockerfile defines the following procedure:

  • Downloads the public continuumio/miniconda3:latest image.
  • Creates a local environment called classifier_env using Miniconda.
  • Installs the downloaded code in a container folder called /app.
  • Activates the environment and runs the Flask service.
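Note that the Dockerfile never runs `conda activate`: the `ENV PATH` line achieves the same effect by prepending the environment's bin directory, so the `python` that `CMD` resolves is the environment's one. The following standalone sketch (illustrative only, not part of the project) mimics that first-match PATH lookup with fake executables in temporary directories:

```python
import os
import shutil
import tempfile

def first_on_path(cmd, dirs):
    """Return the first executable named `cmd` along `dirs`, mimicking
    how the shell (and Docker's CMD) resolves `python` via PATH."""
    return shutil.which(cmd, path=os.pathsep.join(dirs))

def make_fake_exe(directory, name):
    """Create a tiny executable stand-in for demonstration purposes."""
    exe = os.path.join(directory, name)
    with open(exe, "w") as f:
        f.write("#!/bin/sh\n")
    os.chmod(exe, 0o755)
    return exe

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as env_bin, \
         tempfile.TemporaryDirectory() as sys_bin:
        make_fake_exe(env_bin, "python")
        make_fake_exe(sys_bin, "python")
        # Prepending env_bin (like the ENV PATH line) makes it win.
        print(first_on_path("python", [env_bin, sys_bin]))
```

This is why the order in `ENV PATH /opt/conda/envs/classifier_env/bin:$PATH` matters: the environment's directory comes first, shadowing the base interpreter.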

Building Docker Image

Build your Docker image using the Dockerfile:

docker build -t "hm-predictor:dockerfile" .

Run your application.

docker run -p 5000:5000 "hm-predictor:dockerfile"

Running Service Calls

Once the service is running, we can call its endpoints as follows (using sample data):

Training:

curl -X POST -H "Content-Type: application/json" -d '{"images_batch_path": "mnist.npz"}' http://localhost:5000/digits/train

Sample output: True

Prediction:

curl -X POST -H "Content-Type: application/json" -d '{"image_path": "test/data/n5.png"}' http://localhost:5000/digits/classify

Sample output: [5]
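The same two calls can also be scripted from Python. This sketch uses only the standard library and mirrors the curl payloads above; it assumes the container from the previous section is already running on localhost:5000:

```python
import json
from urllib import request

def call_service(url, payload):
    """POST a JSON payload to the service and return the decoded JSON response."""
    body = json.dumps(payload).encode("utf-8")
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    base = "http://localhost:5000"
    # Train on the sample MNIST batch, then classify a sample image.
    print(call_service(base + "/digits/train",
                       {"images_batch_path": "mnist.npz"}))
    print(call_service(base + "/digits/classify",
                       {"image_path": "test/data/n5.png"}))
```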

Running the Tests

The unit test is located at test/classification_service_test.py.

To run it from the project root:

python -m test.classification_service_test

After you follow all these steps, a basic Docker environment for your machine learning pipeline will be running.