Handwriting number recognizer with Flutter and Tensorflow (part II)

Source: Deep Learning on Medium

Creating the project

Let’s start creating a Flutter application! I’ll call the project handwritten_number_recognizer but feel free to name it as you prefer.

Once you have the project created, you should be seeing something like this:

The first thing we are going to do is clean up that main.dart file and leave it as follows:

import 'package:flutter/material.dart';

void main() => runApp(HandwrittenNumberRecognizerApp());

class HandwrittenNumberRecognizerApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Number Recognizer',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
    );
  }
}

You will notice we are leaving the home parameter empty for the moment; we will come back to it shortly.

Creating the scene

Now, beside your lib/main.dart file, create a new file called recognizer_screen.dart and paste the following:

import 'package:flutter/material.dart';

class RecognizerScreen extends StatefulWidget {
  RecognizerScreen({Key key, this.title}) : super(key: key);

  final String title;

  @override
  _RecognizerScreen createState() => _RecognizerScreen();
}

class _RecognizerScreen extends State<RecognizerScreen> {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text(widget.title),
      ),
      body: Container(
        child: Text('My screen'),
      ),
    );
  }
}

This is going to be the screen where we are going to work, so we have the code separated from the main app class.

Do you remember that home parameter we left empty? Go back to main.dart and import the new file we just created:

import 'package:handwritten_number_recognizer/recognizer_screen.dart';

And fill the value in the home parameter:

home: RecognizerScreen(title: 'Number recognizer',),

If you are wondering why those commas at the end of each “part” of the line: try reformatting the code with dartfmt. If you are in Android Studio with the Dart plugin installed, right-clicking in the editor should show that option. There is also a Dart plugin for Visual Studio Code that can run this formatting tool.

At this point, if you run the app, you should be seeing something like this:

Our app’s first screen

Let’s import stuff

There are two things we are going to need to import, the model and the tensorflow library; and for both of them we will need to modify the pubspec.yaml file.

Importing the model

Remember the exported model we got from the previous article? Let’s bring it to our app. For doing so, we are going to create a folder called assets at the project root level, and we will drag and drop the model into this folder. It should look something like this:

We will also create a labels.txt file, as you can see in the image above. Simply fill this file with the digits 0 through 9, one per line, as follows:

0
1
2
3
4
5
6
7
8
9
This is the label that will be associated with the output of our model later. In the case of the digits it is a no-brainer: output node 9 corresponds to label 9. But imagine we were trying to classify something else, like types of flowers; then maybe output 1 would correspond to a rose. This labels file is what provides that correspondence for us.

Our prediction, and we will see this later, is going to return an index, a label and a probability. The label value will be taken from labels.txt in our case.
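To make that concrete, here is a hypothetical sketch of the shape a single prediction takes with the tflite plugin we will add below. The values are made up for illustration; the key names follow the plugin's documented output format.

```dart
// Hypothetical example of one prediction result. The plugin returns
// a list of maps like this; "label" comes from our labels.txt file.
final recognition = {
  "index": 9,         // the model's output node with the highest score
  "label": "9",       // the matching line from labels.txt
  "confidence": 0.97, // the probability assigned to that label
};
```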

Not everything is done yet! In Flutter, we need to declare our assets in the pubspec.yaml file that we mentioned earlier. If we open this file and get rid of most of the commented code, we have this:

name: handwritten_number_recognizer
description: A handwritten number recognizer built with Flutter and Tensorflow.

version: 1.0.0+1

environment:
  sdk: ">=2.1.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter

  cupertino_icons: ^0.1.2

dev_dependencies:
  flutter_test:
    sdk: flutter

flutter:
  uses-material-design: true

  # To add assets to your application, add an assets section, like this:
  # assets:
  #  - images/a_dot_burr.jpeg
  #  - images/a_dot_ham.jpeg

This file is written in YAML, so indentation is key. Watch out not to add any extra spaces or tabs by accident.

I left the comments related to the assets on purpose. We have two options: we could declare the whole assets folder, or list only the assets we want to include. Let's follow the second option and replace the commented code with these lines:

  assets:
    - assets/converted_mnist_model.tflite
    - assets/labels.txt

Please bear in mind that assets should be indented at the same level as uses-material-design.

Importing Tensorflow Lite

The next thing we need is a TensorFlow Lite library, so we can run the inference (prediction) of our model locally on the device. Unfortunately, there is no official TensorFlow support for Flutter just yet, but a pretty good library called tflite already exists; you can find a link to it here.

If you open the link and go to the Installing tab, it tells us what we need to do: basically, add it to our dependencies. We do that in pubspec.yaml, and that part should look like this:

dependencies:
  flutter:
    sdk: flutter

  cupertino_icons: ^0.1.2
  tflite: ^1.0.4

Then we only need to get our new packages. If you use the command line, just run:

flutter pub get

However, if you are using the Flutter plugin in Android Studio, there should be a Packages get button at the top right of your editor; press it to fetch the new packages.
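Once the package is fetched, here is a minimal sketch of what loading our model will look like. The method and parameter names follow the tflite package's README at the time of writing; we will wire this into the screen in a later step.

```dart
import 'package:tflite/tflite.dart';

// A minimal sketch: load the model and labels we declared as assets.
Future<void> loadModel() async {
  final result = await Tflite.loadModel(
    model: "assets/converted_mnist_model.tflite",
    labels: "assets/labels.txt",
  );
  print(result); // the plugin reports the load status as a string
}
```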

There is one thing left to have tflite ready to use in our app. If you look at the pub.dev page for tflite, there is a table of contents whose first section is Installation. There they detail a manual step required to get it working on Android, and a troubleshooting section for iOS, as you can see in the image below.

Tflite installation page
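For reference, the Android step shown there amounts to telling Gradle not to compress the model asset. At the time of writing, the tflite README asks you to add an aaptOptions block to the android section of android/app/build.gradle, roughly like this:

```groovy
android {
    // Prevent Gradle from compressing the .tflite model asset,
    // which would otherwise break loading it at runtime.
    aaptOptions {
        noCompress 'tflite'
        noCompress 'lite'
    }
}
```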


You will also need (at the time of writing this article) a minimum SDK version of 19 for Android. You can change this in android/app/build.gradle (the same file where you added the aaptOptions above).

minSdkVersion 19


On iOS, tflite requires a minimum deployment target of 9.0.
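If your ios/Podfile does not already target 9.0 or later, you can set it there. This is standard CocoaPods syntax; adjust it to match how your Podfile is laid out.

```ruby
# ios/Podfile — ensure the deployment target meets tflite's minimum
platform :ios, '9.0'
```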