Original article can be found here (source): Artificial Intelligence on Medium
Keras Model For Tensorflow Serving (TF serving with Keras)
In this article, we will walk through an effective way to use a Keras model with Tensorflow Serving.
We will see how to convert both a custom-built Keras model and a pretrained (built-in) Keras model for use with Tensorflow Serving.
Convert Custom Keras Model:
This section assumes you are familiar with building Keras models.
Let’s define a simple neural network using Keras:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))  # output layer for binary classification
Compile the model:
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
Now let’s save this model as a “.h5” file:
model.save('my_keras_model.h5')
This will save our model as “my_keras_model.h5” in our working directory.
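Putting the steps so far together, here is a minimal, self-contained sketch. The training data here is random dummy data, used only to make the example runnable; in practice you would fit on your real dataset.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Dummy binary-classification data: 100 samples, 8 features each,
# matching the Dense(12, input_dim=8) input layer above.
X = np.random.rand(100, 8)
y = np.random.randint(0, 2, size=(100,))

model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, epochs=1, verbose=0)

# Writes the HDF5 file to the current working directory.
model.save('my_keras_model.h5')
```

Once this runs, “my_keras_model.h5” is ready for the conversion step below.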
Now let’s begin the conversion process.
Make sure that you have Tensorflow ≥ 2.0. I recommend doing this in a virtual environment so that your base environment doesn’t get affected.
First, we will load our saved model using Keras from the TensorFlow module, which will let us convert the model.
import tensorflow as tf
model = tf.keras.models.load_model('my_keras_model.h5')
Now we will convert the Model to the format required by Tensorflow Serving.
Note that we are importing Keras from the Tensorflow module.
from tensorflow import keras
keras.experimental.export_saved_model(model, 'converted_model/1/')
Tensorflow Serving expects our model to be in a specific directory layout so that it can take care of versioning. More on Tensorflow Serving model versioning in the upcoming articles.
The above code will create a directory named “converted_model” containing a subdirectory named “1”, representing that it is the first version of the model.
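For reference, the exported directory should look roughly like this (the exact files can vary with the TensorFlow version):

```
converted_model/
└── 1/
    ├── saved_model.pb
    ├── assets/
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index
```

If you see “saved_model.pb” and a “variables/” folder inside the version directory, the export worked.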
Now our Custom Keras Model is ready to use with Tensorflow Serving.
Convert Pretrained(Inbuilt) Keras Model:
For this illustration, we will use the pretrained “VGG 16” model from Keras applications.
Following our conversion process, we first need the “.h5” file of the “VGG 16” model.
We can either download the “.h5” file from the Keras GitHub repository, or load the model (which was downloaded the first time we used it) and save it as “.h5”.
Here we will use the second approach, because the first one is straightforward, and this approach might help you with future use cases.
from tensorflow.keras.applications.vgg16 import VGG16
model = VGG16()
Now we have loaded the model. Let’s save it as a “.h5” file:
model.save('VGG_16.h5')
This will save our model as “VGG_16.h5” in our working directory.
Then we follow the same steps that we did earlier (refer to “Now let’s begin the conversion process” above) to convert the “.h5” file to the format required by Tensorflow Serving.
Stay tuned for the next articles on how to actually serve this converted model with Tensorflow Serving.