Let us deploy a deep learning TensorFlow model on edge devices using TensorFlow Lite.

There are three different ways to use the TensorFlow Lite converter:

- Convert a TF SavedModel to TF Lite
- Convert a Keras prebuilt model to TF Lite
- Convert a concrete function to TF Lite

**1. Convert TF SavedModel to TF Lite:-**

Let us create a simple model using TensorFlow and save it in the SavedModel format. To develop this model we will use the TensorFlow API. In this example, we will show how to convert a SavedModel into a TF Lite FlatBuffer.

```python
import pathlib
import tensorflow as tf

# Construct a basic TF model.
root = tf.train.Checkpoint()
root.v1 = tf.Variable(3.)
root.v2 = tf.Variable(2.)
root.f = tf.function(lambda x: root.v1 * root.v2 * x)

# Save the model into a temp directory.
export_dir = "/tmp/test_saved_model"
input_data = tf.constant(1., shape=[1, 1])
to_save = root.f.get_concrete_function(input_data)
tf.saved_model.save(root, export_dir, to_save)

# Convert the model into TF Lite.
converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)
tflite_model = converter.convert()

# Save the TF Lite model.
tflite_model_file = pathlib.Path("/tmp/save_model_tflite.tflite")
tflite_model_file.write_bytes(tflite_model)
```
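As a quick sanity check, we can load the converted FlatBuffer back into the TF Lite interpreter and confirm it still computes `v1 * v2 * x` (here `3 * 2 * 1 = 6`). This is a minimal sketch that rebuilds the same toy model from the steps above:

```python
import numpy as np
import tensorflow as tf

# Rebuild the toy model (f(x) = v1 * v2 * x = 6x) and convert it.
root = tf.train.Checkpoint()
root.v1 = tf.Variable(3.)
root.v2 = tf.Variable(2.)
root.f = tf.function(lambda x: root.v1 * root.v2 * x)

export_dir = "/tmp/test_saved_model"
input_data = tf.constant(1., shape=[1, 1])
tf.saved_model.save(root, export_dir, root.f.get_concrete_function(input_data))

converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)
tflite_model = converter.convert()

# Run the TF Lite model on x = 1.0 and check the result.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.array([[1.]], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result)  # [[6.]]
```

If the conversion preserved the graph, the TF Lite output matches the original function exactly for this simple multiply.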

**2. Convert a Keras Prebuilt Model to TF Lite:-**

In this section, we will explore how to convert a prebuilt Keras model into a TF Lite model. We will convert a pre-trained `tf.keras` MobileNet model to TensorFlow Lite.

```python
import pathlib
import numpy as np
import tensorflow as tf

# Load the MobileNet Keras model:
# create a tf.keras model from weights pre-trained on the ImageNet dataset.
model = tf.keras.applications.MobileNetV2(
    weights="imagenet", input_shape=(224, 224, 3))

# Since we already have a Keras model, there is no need to export a
# SavedModel first; pass the model directly to TFLiteConverter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# If you want to save the TF Lite model, use the steps below; otherwise skip them.
tflite_model_file = pathlib.Path("/tmp/pretrainedmodel.tflite")
tflite_model_file.write_bytes(tflite_model)

# Load the TF Lite model with the interpreter and allocate tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
```
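After `allocate_tensors()`, inference follows the same pattern for any converted model: look up the input and output tensor details, set the input, invoke, and read the output. Here is a minimal, self-contained sketch using a tiny stand-in Keras model (so it runs without downloading MobileNet weights); the interpreter calls are identical for the MobileNet model above:

```python
import numpy as np
import tensorflow as tf

# A tiny stand-in model (hypothetical; any converted model works the same way).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(3,), activation="relu"),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Run inference with the TF Lite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a batch of one random input and read the output tensor.
x = np.random.rand(1, 3).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(output_details[0]["index"])
print(y.shape)  # (1, 2)
```

For the MobileNet model, the input would instead be a `(1, 224, 224, 3)` float tensor preprocessed the same way as during training.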

**3. Concrete Function to TF Lite:-**

In order to convert TensorFlow 2.0 models to TensorFlow Lite, the model needs to be exported as a concrete function. If you have developed your model using TF 2.0, then this is for you. We will convert the concrete function into a TF Lite model. In this section, we will again use the Keras MobileNet model.

```python
import pathlib
import tensorflow as tf

# Load the Keras MobileNet model.
model = tf.keras.applications.MobileNetV2(
    weights="imagenet", input_shape=(224, 224, 3))
```

We will use `tf.function` to create a callable TensorFlow graph of our model.

```python
# Get a callable graph from the model.
run_model = tf.function(lambda x: model(x))

# Get the concrete function from the callable graph.
concrete_func = run_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype))

# Convert the concrete function into a TF Lite model using TFLiteConverter.
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
tflite_model = converter.convert()

# Save the model.
tflite_model_file = pathlib.Path("/tmp/concretefunc_model.tflite")
tflite_model_file.write_bytes(tflite_model)
```

**CLI TF Lite Converter:-**

Apart from the Python API, we can also use a command-line interface to convert a model; for example, the TF Lite converter can convert a SavedModel to a TF Lite model. The TensorFlow Lite converter ships with a command-line tool, `tflite_convert`, which supports basic models.

```shell
#!/usr/bin/env bash
tflite_convert \
  --saved_model_dir=/tmp/mobilenet_saved_model \
  --output_file=/tmp/mobilenet.tflite
```

- `--output_file`. Type: string. Specifies the full path of the output file.
- `--saved_model_dir`. Type: string. Specifies the full path to the directory containing the SavedModel generated in 1.X or 2.X.
- `--keras_model_file`. Type: string. Specifies the full path of the HDF5 file containing the `tf.keras` model generated in 1.X or 2.X.

```shell
#!/usr/bin/env bash
tflite_convert \
  --keras_model_file=model.h5 \
  --output_file=/tmp/mobilenet_keras.tflite
```

The converter supports SavedModel directories, `tf.keras` models, and concrete functions.

For now, we will end with these options. In the next article, we will explore converting RNN models and quantized models.
