Deploy MobileNet model to Android Platform

Source: Deep Learning on Medium

Hello again,

This blog is a continuation of the previous blog, “Memes Detection Android App using Deep Learning”. In this blog, we are going to deploy the model in an Android environment.

Please go through my previous blog to understand the business problem and the machine learning formulation for this problem.

Part 2

In the previous blog, we built a VGG19 model; here we are going to fine-tune a pre-trained MobileNet Keras model.

What are we going to learn in this blog?

  • Convert Keras model to TFLite model
  • Deploy the model on an android platform

Below is a code snippet to fine-tune the MobileNet model.

IMG_SIZE = 224
IMG_SHAPE = (IMG_SIZE, IMG_SIZE, 3)

mobilenet_v1 = keras.applications.mobilenet.MobileNet(include_top=False, input_shape=IMG_SHAPE)

prediction_layer = keras.layers.Dense(2, activation='softmax')
flatter = keras.layers.Flatten()

model = keras.Sequential([mobilenet_v1, flatter, prediction_layer])

for layer in model.layers:
    print(layer, layer.trainable)

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

earlyStop = EarlyStopping(monitor='val_loss', mode='min', verbose=1, patience=2)

# train_generator / val_generator come from the data pipeline in Part 1
history = model.fit_generator(train_generator, validation_data=val_generator,
                              epochs=10, callbacks=[earlyStop])
MobileNet training graph: training loss vs. validation loss


Training LogLoss = 0.13648
Training Accuracy = 96.644 %

Validation LogLoss = 0.18804
Validation Accuracy = 95.446 %

Test LogLoss = 0.22778
Test Accuracy = 95.323 %
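For reference, the LogLoss reported above is the mean negative log-likelihood of the true class. A minimal pure-Python version (a hypothetical helper, not part of the project code) shows exactly what is being measured:

```python
import math

def log_loss(y_true, y_prob, eps=1e-15):
    """Mean negative log-likelihood of the true class.

    y_true: list of integer class labels
    y_prob: list of per-class probability lists (e.g. softmax outputs)
    """
    total = 0.0
    for label, probs in zip(y_true, y_prob):
        # Clip to avoid log(0) for overconfident predictions.
        p = min(max(probs[label], eps), 1.0 - eps)
        total += -math.log(p)
    return total / len(y_true)

# Two samples: class 1 predicted confidently, class 0 less so.
print(log_loss([1, 0], [[0.1, 0.9], [0.6, 0.4]]))  # ≈ 0.308
```

The lower the value, the closer the predicted probability of the correct class is to 1.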

Convert Keras model to TFLite model

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

Then we can save the model using the code below.

open("/content/drive/My Drive/mobilenet/best_model3.tflite", "wb").write(tflite_model)

Note: We have stored the model using .tflite file extension.
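Before moving to Android, it is worth sanity-checking the converted model from Python with `tf.lite.Interpreter`. A minimal sketch: it uses a tiny stand-in model (since the real fine-tuned MobileNet is built above); only the convert-and-invoke pattern matters here.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Stand-in for the fine-tuned MobileNet: same input/output shapes.
model = keras.Sequential([
    keras.layers.Input(shape=(224, 224, 3)),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(2, activation='softmax'),
])

# Convert to TFLite exactly as above.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the converted bytes and allocate input/output tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Run inference on one dummy 224x224 RGB image.
dummy = np.random.rand(1, 224, 224, 3).astype(np.float32)
interpreter.set_tensor(inp['index'], dummy)
interpreter.invoke()
probs = interpreter.get_tensor(out['index'])
print(probs.shape)  # (1, 2): one probability per class
```

For the saved file, pass `model_path="best_model3.tflite"` to the interpreter instead of `model_content`.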

If you know the basics of Android development, then deploying a “TF-Lite” model is trivial.

We have to appreciate this Google Codelab's code snippets.

Deploy the model on an android platform

The steps below will be helpful.

  • After converting the Keras model to a TFLite model as shown above, save the file with the .tflite extension.
  • Create a simple Android project using Android Studio.
  • In the Android project directory, create a folder called “assets”.
  • Store the “model.tflite” file in that “assets” folder.
  • Create a “class_labels.txt” file that lists the output categories. In our case, it should contain two class labels, i.e. “Meme” and “Not Meme”. If you are working on a 1000-class classification problem, then “class_labels.txt” will contain 1000 class labels.
  • The Google Codelab project has an “ImageClassifier.java” file for reference. It acts as the interpreter in the Codelab's case; the interpreter talks between the .tflite model and the Android/Java/C++ client.
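About the label file in the steps above: “class_labels.txt” is just plain text with one label per line, and the line order must match the model's output indices (positions 0 and 1 of the softmax output). It can be generated like this:

```python
# One label per line, in the same order as the model's output classes.
labels = ["Meme", "Not Meme"]
with open("class_labels.txt", "w") as f:
    f.write("\n".join(labels) + "\n")
```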

In the Android “build.gradle” file, add the lines below: they stop the build from compressing the model file and add the TFLite dependency.

android {
    aaptOptions {
        noCompress "tflite"
        noCompress "lite"
    }
}

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:+'
}

The final step is to add the interpreter class.

Let's understand our interpreter class, which can be found here.

private static final String MODEL_PATH = "model.tflite";
private static final String LABEL_PATH = "labels.txt";

The code below loads the .tflite model.

/** Memory-map the model file in assets. */
private MappedByteBuffer loadModelFile(Context activity) throws IOException {
    AssetFileDescriptor fileDescriptor = activity.getAssets().openFd(MODEL_PATH);
    FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
    FileChannel fileChannel = inputStream.getChannel();
    long startOffset = fileDescriptor.getStartOffset();
    long declaredLength = fileDescriptor.getDeclaredLength();
    return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
}

The code below loads the class names.

/** Reads the label list from assets. */
private List<String> loadLabelList(Context activity) throws IOException {
    List<String> labelList = new ArrayList<String>();
    BufferedReader reader =
        new BufferedReader(new InputStreamReader(activity.getAssets().open(LABEL_PATH)));
    String line;
    while ((line = reader.readLine()) != null) {
        labelList.add(line);
    }
    reader.close();
    return labelList;
}

The method below classifies an image.

/** Classifies a frame from the preview stream. */
public String classifyImages(Bitmap bitmap) {
    if (tflite == null) {
        Log.e(TAG, "Image classifier has not been initialized; Skipped.");
        return "Uninitialized Classifier.";
    }
    ByteBuffer imgData = convertBitmapToByteBuffer(bitmap);
    // Here's where the magic happens!!!
    long startTime = SystemClock.uptimeMillis();
    tflite.run(imgData, labelProbArray);
    long endTime = SystemClock.uptimeMillis();
    Log.d(TAG, "Timecost to run model inference: " + Long.toString(endTime - startTime));

    float findProba0 = labelProbArray[0][0];
    float findProba1 = labelProbArray[0][1];

    String result;
    if (findProba1 < findProba0) {
        result = labelList.get(0);
    } else {
        result = labelList.get(1);
    }
    return result;
}
  • convertBitmapToByteBuffer: this method converts the input image into the byte buffer the model consumes.
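In Java, convertBitmapToByteBuffer walks the bitmap's pixels and writes each channel into a float ByteBuffer sized for the model's input tensor. The logic is easiest to sketch in Python (an illustration only, assuming the model expects 224x224 RGB floats scaled to [0, 1]; the scaling must match however the training images were preprocessed):

```python
IMG_SIZE = 224

def bitmap_to_buffer(pixels):
    """Flatten HxWx3 integer pixels (0-255) into the flat float list
    of length 224 * 224 * 3 that the model's input tensor expects."""
    buf = []
    for row in pixels:
        for (r, g, b) in row:
            buf.extend((r / 255.0, g / 255.0, b / 255.0))
    return buf

# A 224x224 all-white image.
white = [[(255, 255, 255)] * IMG_SIZE for _ in range(IMG_SIZE)]
buf = bitmap_to_buffer(white)
print(len(buf))  # 150528 = 224 * 224 * 3
```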


Run the Android Studio project and supply test images to the “classifyImages” method. It will return whether an image is a Meme or Not a Meme.

Source code can be found here.


Android App Landing Page

1. Check a Single Image Locally

This means the MobileNet TFLite model runs locally and we are testing only a single image.

Single Image Locally test

2. Check Multiple Images Locally

This means the MobileNet TFLite model runs locally and we are testing multiple images.

An image that is classified as a Meme is marked with this STAMP image.

multiple images check locally


Also, don't miss Part 3: deploying the Keras model to a Flask app.

Please appreciate our efforts if you like this blog.

Thank you for your time.

You can check out similar interesting blogs here and here.

Github repo for this project is here.