Mobile deployment lets you run AI models directly on phones. Running on-device reduces latency and lets apps work without an internet connection.
Mobile deployment (TFLite, Core ML) in Computer Vision
Introduction
You want a photo app that recognizes objects instantly on your phone.
You need a voice assistant that works offline on mobile devices.
You want to save data by processing AI tasks on the phone instead of the cloud.
You are building a fitness app that tracks movements using the phone camera.
You want to add AI features to an app without slowing it down.
Syntax
For TensorFlow Lite (TFLite):

    import tensorflow as tf

    # Convert a TensorFlow SavedModel to TFLite format
    converter = tf.lite.TFLiteConverter.from_saved_model('saved_model_dir')
    tflite_model = converter.convert()

    # Save the TFLite model
    with open('model.tflite', 'wb') as f:
        f.write(tflite_model)

For Core ML (Apple devices):

    import coremltools as ct

    # Convert a TensorFlow or PyTorch model to Core ML
    coreml_model = ct.convert(model)

    # Save the Core ML model
    coreml_model.save('model.mlmodel')
TFLite targets Android and many other platforms; Core ML targets Apple devices (iOS, iPadOS, macOS).
Conversion rewrites the trained model into a smaller, faster format suited to mobile hardware.
Examples
This converts a saved TensorFlow model to TFLite format and saves it.
    import tensorflow as tf

    # Convert a SavedModel directory to TFLite
    converter = tf.lite.TFLiteConverter.from_saved_model('my_model')
    tflite_model = converter.convert()

    # Write the flatbuffer to disk for bundling with the app
    with open('model.tflite', 'wb') as f:
        f.write(tflite_model)
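Once converted, the .tflite file (or the in-memory bytes) can be run with tf.lite.Interpreter, which is the same API the TFLite runtime exposes on device. A minimal sketch, assuming a tiny stand-in Keras model in place of a real trained one and converting it in memory:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model (hypothetical; your trained model would be used instead)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation='softmax'),
])

# Convert directly from the in-memory Keras model
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the converted model into the TFLite interpreter
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run one inference on a random sample
sample = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]['index'], sample)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]['index'])
print('Output shape:', output.shape)
```

On Android the same model bytes are loaded through the TFLite Java/Kotlin runtime; the Python interpreter here is handy for testing the conversion on your development machine first.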
This converts a TensorFlow model to Core ML format for iOS apps.
    import coremltools as ct

    # my_tf_model is your trained TensorFlow (or PyTorch) model object
    coreml_model = ct.convert(my_tf_model)
    coreml_model.save('MyModel.mlmodel')
Sample Model
This code trains a small model, saves it, converts it to TFLite, and prints the TFLite model size.
    import tensorflow as tf
    import numpy as np

    # Create a simple TensorFlow model
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(3, activation='relu', input_shape=(2,)),
        tf.keras.layers.Dense(1, activation='sigmoid')
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

    # Train on dummy data
    x_train = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
    y_train = np.array([0, 1, 1, 0], dtype=np.float32)
    model.fit(x_train, y_train, epochs=5, verbose=1)

    # Export as a TensorFlow SavedModel directory
    # (on older TF/Keras versions, model.save('saved_model') did this)
    model.export('saved_model')

    # Convert to TFLite
    converter = tf.lite.TFLiteConverter.from_saved_model('saved_model')
    tflite_model = converter.convert()

    with open('model.tflite', 'wb') as f:
        f.write(tflite_model)

    print('TFLite model size:', len(tflite_model), 'bytes')
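A conversion like the one above can be sanity-checked by comparing the original model's predictions against the TFLite interpreter's output on the same input; for a plain float32 conversion the two should agree almost exactly. A minimal sketch, again using a small stand-in model rather than a real trained one:

```python
import numpy as np
import tensorflow as tf

# Small stand-in model (hypothetical; substitute your trained model)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),
    tf.keras.layers.Dense(3, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Prediction from the original Keras model
x = np.array([[0, 1]], dtype=np.float32)
keras_out = model(x).numpy()

# Prediction from the converted TFLite model
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp['index'], x)
interpreter.invoke()
tflite_out = interpreter.get_tensor(out['index'])

# For float32 conversion, the difference should be near numerical noise
print('max abs difference:', np.max(np.abs(keras_out - tflite_out)))
```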
Important Notes
Always test the converted model on the target device to check speed and accuracy.
Some layers and operations are not supported by the converters; keep model architectures simple for mobile.
Use quantization during conversion to make models smaller and faster.
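The quantization note above can be sketched with TFLite's dynamic-range quantization, enabled by setting converter.optimizations before converting. This uses a hypothetical stand-in model just to compare the resulting file sizes:

```python
import tensorflow as tf

# Stand-in model (hypothetical); any trained Keras model works the same way
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(10),
])

# Baseline float32 conversion
converter = tf.lite.TFLiteConverter.from_keras_model(model)
float_model = converter.convert()

# Dynamic-range quantization: weights are stored as int8
converter.optimizations = [tf.lite.Optimize.DEFAULT]
quant_model = converter.convert()

print('float32 model:  ', len(float_model), 'bytes')
print('quantized model:', len(quant_model), 'bytes')
```

Dynamic-range quantization needs no calibration data; full integer quantization shrinks models further but requires a representative dataset, and either way you should re-check accuracy on the target device afterwards.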
Summary
Mobile deployment runs AI models directly on phones for speed and offline use.
TFLite is for Android and Core ML is for Apple devices.
Convert your trained model to these formats before adding to mobile apps.