
TensorFlow Lite conversion

Introduction
TensorFlow Lite conversion shrinks your trained machine learning models and speeds up their predictions so they can run on phones and other small devices.
You want to run a model on a smartphone without an internet connection.
You need your model to use less battery and memory.
You want to add AI features to a small gadget like a smartwatch.
You want faster predictions on devices with limited power.
You want to share your model with others to use in mobile apps.
Syntax
TensorFlow
import tensorflow as tf

# Load your trained model
model = tf.keras.models.load_model('model.h5')

# Create a converter from the model
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Convert the model to TensorFlow Lite format
lite_model = converter.convert()

# Save the converted model
with open('model.tflite', 'wb') as f:
    f.write(lite_model)
Use from_keras_model() when you have a Keras model loaded or built in memory.
The converted model is saved as a .tflite file, which is smaller and optimized for on-device inference.
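Once you have a .tflite file, it runs through the TFLite Interpreter rather than regular TensorFlow. Here is a minimal sketch of inference with the interpreter, using a tiny stand-in model converted in memory (the layer sizes are illustrative, not part of the syntax above):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model; in practice this is the model you converted above
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(5,))])
lite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the converted bytes; model_path='model.tflite' works the same way
interpreter = tf.lite.Interpreter(model_content=lite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run one prediction on a random input batch of shape (1, 5)
x = np.random.random((1, 5)).astype(np.float32)
interpreter.set_tensor(input_details[0]['index'], x)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]['index'])
print(prediction.shape)
```

On Android or iOS you would load the same .tflite file with the platform's TFLite runtime; the pattern (allocate tensors, set input, invoke, read output) stays the same.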
Examples
Convert a model saved in TensorFlow SavedModel format.
TensorFlow
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
lite_model = converter.convert()
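from_saved_model() expects a SavedModel directory on disk. A short sketch of producing one from a Keras model and then converting it (the directory name is illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Export to SavedModel format first; on older TF versions use
# tf.saved_model.save(model, 'saved_model_dir') instead
model.export('saved_model_dir')

converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
lite_model = converter.convert()
```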
Convert a Keras model with optimization to reduce size and improve speed.
TensorFlow
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
lite_model = converter.convert()
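Optimize.DEFAULT on its own applies dynamic-range quantization to the weights. If you also supply a representative dataset, the converter can calibrate activation ranges for fuller quantization. A hedged sketch, with random data standing in for real calibration samples:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(5,))])

# Representative dataset: a few batches shaped like real model inputs,
# used by the converter to calibrate quantization ranges
def representative_dataset():
    for _ in range(100):
        yield [np.random.random((1, 5)).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
quantized_model = converter.convert()
print(len(quantized_model), 'bytes')
```

In a real project the generator should yield samples from your actual training or validation data, since the calibration quality depends on how representative they are.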
Convert a TensorFlow ConcreteFunction to TensorFlow Lite format.
TensorFlow
converter = tf.lite.TFLiteConverter.from_concrete_functions([func])
lite_model = converter.convert()
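from_concrete_functions() needs a ConcreteFunction, which you typically get from a tf.function with a fixed input signature. A minimal sketch using a toy tf.Module (the Doubler computation is just a placeholder):

```python
import tensorflow as tf

# Toy module; the computation and input signature are placeholders
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 3], tf.float32)])
    def __call__(self, x):
        return 2.0 * x

module = Doubler()
func = module.__call__.get_concrete_function()

# Recent TF versions also take the owning object as a second argument
converter = tf.lite.TFLiteConverter.from_concrete_functions([func], module)
lite_model = converter.convert()
```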
Sample Model
This code trains a small model on random data, converts it to TensorFlow Lite format, and prints the size of the converted model.
TensorFlow
import tensorflow as tf
import numpy as np

# Create a simple model
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(5,)),
    tf.keras.layers.Dense(1)
])

# Compile and train the model on dummy data
model.compile(optimizer='adam', loss='mse')
x_train = np.random.random((100, 5))
y_train = np.random.random((100, 1))
model.fit(x_train, y_train, epochs=2, verbose=1)

# Convert the trained model to TensorFlow Lite
converter = tf.lite.TFLiteConverter.from_keras_model(model)
lite_model = converter.convert()

# Save the TensorFlow Lite model
with open('simple_model.tflite', 'wb') as f:
    f.write(lite_model)

print('TensorFlow Lite model size:', len(lite_model), 'bytes')
Important Notes
TensorFlow Lite models are much smaller and faster but may lose some accuracy.
You can apply optimizations like quantization during conversion to make models even smaller.
Always test the converted model on your device to check performance and accuracy.
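One simple way to follow that advice is to compare the converted model's predictions with the original's on the same input. A sketch with a small untrained model (plain float conversion should match the Keras outputs almost exactly; quantized conversion will drift more):

```python
import numpy as np
import tensorflow as tf

# Small untrained model; training does not change this check
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(5,)),
    tf.keras.layers.Dense(1)
])
lite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=lite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed the same input through both the TFLite and Keras models
x = np.random.random((1, 5)).astype(np.float32)
interpreter.set_tensor(inp['index'], x)
interpreter.invoke()
lite_pred = interpreter.get_tensor(out['index'])
keras_pred = model.predict(x, verbose=0)

max_diff = float(np.max(np.abs(lite_pred - keras_pred)))
print('max difference:', max_diff)
```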
Summary
TensorFlow Lite conversion makes models small and fast for mobile and embedded devices.
You convert models with TFLiteConverter, starting from a Keras model, a SavedModel directory, or a ConcreteFunction.
Saving the converted model creates a .tflite file ready for deployment.