Convert a TensorFlow frozen graph to a TensorFlow lite (tflite) file (Part 3)
Converting your inference graph file to a TensorFlow Lite (.tflite) file for use on mobile.
In Part 2, we saw how to successfully train our model. If you have not gone through it, click here to learn how to train your own model.
Now the question is: how do we get this model to run in a mobile app, given that the files generated after training are not compatible with mobile environments (Android or iOS)?
This post gives detailed steps on how to test your trained object detection model in an Android app.
Create a frozen graph
We will start by creating a TensorFlow frozen graph with compatible ops that can be used with TensorFlow Lite. This is done by running the command below from the object_detection folder:
python export_tflite_ssd_graph.py \
--pipeline_config_path=training/ssd_mobilenet_v2_quantized_300x300_coco.config \
--trained_checkpoint_prefix=training/model.ckpt-50233 \
--output_directory=tflite \
--add_postprocessing_op=true
A directory named tflite is created containing two files: tflite_graph.pb and tflite_graph.pbtxt. The add_postprocessing_op flag enables the model to take advantage of a custom optimized detection post-processing operation, which can be seen as a replacement for tf.image.non_max_suppression.
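For intuition, here is a minimal plain-Python sketch of what non-max suppression does: keep the highest-scoring box and drop any remaining box that overlaps it too much. The helper names are my own; the actual TFLite op is a fused, optimized implementation, not this code.

```python
def iou(a, b):
    # Intersection-over-union of two boxes given as (ymin, xmin, ymax, xmax).
    y1, x1 = max(a[0], b[0]), max(a[1], b[1])
    y2, x2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, y2 - y1) * max(0, x2 - x1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    # Greedily keep the highest-scoring box, drop boxes that overlap a kept one.
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in keep):
            keep.append(i)
    return keep
```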
Now we will convert the frozen graph (tflite_graph.pb) created above to the TensorFlow Lite FlatBuffer format (detect.tflite). We do this with the command below:
tflite_convert \
--graph_def_file=tflite/tflite_graph.pb \
--output_file=tflite/detect.tflite \
--output_format=TFLITE \
--input_shapes=1,300,300,3 \
--input_arrays=normalized_input_image_tensor \
--output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
--inference_type=QUANTIZED_UINT8 \
--mean_values=128 \
--std_dev_values=127 \
--change_concat_input_ranges=false \
--allow_custom_ops
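The mean_values and std_dev_values flags tell the converter how each uint8 input value maps to a real value: real = (quantized - mean) / std. With mean 128 and std 127, the uint8 pixel range [0, 255] maps to roughly [-1, 1]. A quick sanity check (the helper name is my own, just for illustration):

```python
MEAN, STD = 128, 127  # values passed to tflite_convert above

def dequantize(q):
    """Map a uint8 input value to the real value the model sees."""
    return (q - MEAN) / STD

# The full uint8 range maps to roughly [-1, 1]:
# dequantize(0)   -> -128/127, about -1.008
# dequantize(128) -> 0.0
# dequantize(255) -> 1.0
```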
This creates a file called detect.tflite in the tflite folder. The file is usually below 5 MB; mine is 4.5 MB. It can be even smaller, depending on the size of your inference graph.
To test this file in an Android app, start by downloading and running the Object Detection Android example by TensorFlow. Once you can run that project successfully on your Android phone, copy the detect.tflite file to the assets folder of the project and name it detectx.tflite.
Also create a text file called labelmap1.txt in the assets folder. The content of this file should be as below:
???
person
If you are not using the same classes as me, use the classes you trained your model on instead. The classes should be ordered based on the class IDs you used during training.
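As I understand it, the ??? line occupies the background slot, and each detection's class index is looked up by line number in this file. A plain-Python sketch of that lookup (the example app does the equivalent in Java; the function name is my own):

```python
def load_labels(path):
    # One label per line; the line number is the class index
    # reported by the model (line 0 is the ??? background placeholder).
    with open(path) as f:
        return [line.strip() for line in f]

# labels = load_labels("labelmap1.txt")
# labels[1] would then be "person" for a detection with class index 1
```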
Now update the DetectorActivity file. Change as below:

// from
TF_OD_API_MODEL_FILE = "detect.tflite";
// to
TF_OD_API_MODEL_FILE = "detectx.tflite";

// from
TF_OD_API_LABELS_FILE = "file:///android_asset/labelmap.txt";
// to
TF_OD_API_LABELS_FILE = "file:///android_asset/labelmap1.txt";
Now comes the moment of truth we have been waiting for. Run the Android app and enjoy your object (person) detection application.
Hooray, you successfully made it!
You can download the project from my GitHub repo here.
As always, please leave me a comment if you have any questions or suggestions. I'm also new to ML and continuously learning. Share this if you think it will help someone else.