TensorFlow · ~15 mins

Loading and inference in TensorFlow - ML Experiment: Train & Evaluate

Experiment - Loading and inference
Problem: You have a trained TensorFlow model saved on disk. You want to load this model and use it to make predictions on new data.
Current Metrics: None, because the model has not yet been loaded or used for inference.
Issue: Learners often struggle to load a saved model correctly and to run inference to obtain predictions.
Your Task
Load the saved TensorFlow model from disk and run inference on a given input sample. Output the predicted class probabilities.
Use TensorFlow 2.x API for loading and inference.
Do not retrain or modify the model architecture.
Input data must be preprocessed as required by the model.
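Since the task requires preprocessing the input to match what the model expects, here is a minimal sketch of such a step. It assumes (as in the solution below) a model that takes 28x28 grayscale images scaled to [0, 1]; the `preprocess` helper and the scaling convention are illustrative, not part of the original exercise.

```python
import numpy as np

def preprocess(image):
    """Scale raw uint8 pixels to float32 in [0, 1] and add batch/channel dims."""
    x = np.asarray(image, dtype=np.float32) / 255.0  # scale pixel values to [0, 1]
    return x.reshape(1, 28, 28, 1)                   # (batch, height, width, channels)

# Example: a fake raw 28x28 grayscale image
raw = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)
batch = preprocess(raw)
print(batch.shape, batch.dtype)  # → (1, 28, 28, 1) float32
```

Whatever preprocessing the model was trained with (scaling, resizing, channel order) must be reproduced exactly at inference time.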
Solution
import tensorflow as tf
import numpy as np

# Load the saved model from disk
model = tf.keras.models.load_model('saved_model/my_model')

# Example input: a single sample with shape matching model input
# For example, if model expects 28x28 grayscale images:
input_sample = np.random.rand(1, 28, 28, 1).astype(np.float32)  # batch size 1

# Run inference
predictions = model.predict(input_sample)

# Print predicted class probabilities
print('Predicted probabilities:', predictions)
Loaded the saved model using tf.keras.models.load_model().
Created a dummy input sample with correct shape and type.
Used model.predict() to get output probabilities.
Results Interpretation

Before: No model loaded, no predictions possible.

After: Model loaded successfully and predictions generated as probability scores for each class.

Loading a saved TensorFlow model and running inference requires the correct loading API (tf.keras.models.load_model), an input with the shape and dtype the model expects, and model.predict() to obtain the output probabilities.
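Once the probability scores are available, a common follow-up is to convert them into a single predicted class. The sketch below shows this with NumPy's argmax; the probability values are made up for illustration and stand in for the output of model.predict().

```python
import numpy as np

# Hypothetical output of model.predict() for one sample over 10 classes.
predictions = np.array([[0.05, 0.02, 0.70, 0.03, 0.04,
                         0.05, 0.01, 0.04, 0.03, 0.03]])

predicted_class = int(np.argmax(predictions, axis=1)[0])  # index of highest probability
confidence = float(predictions[0, predicted_class])       # that probability itself

print(predicted_class, confidence)  # → 2 0.7
```

axis=1 takes the argmax per sample, so the same line works unchanged for a whole batch of predictions.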
Bonus Experiment
Try loading the model and running inference on multiple input samples at once (batch inference).
💡 Hint
Prepare a batch input array with shape (batch_size, height, width, channels) and pass it to model.predict().
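Following the hint, here is a minimal sketch of building such a batch, again assuming the 28x28 grayscale input shape used in the solution. The model.predict() call is shown as a comment because it requires the loaded model from the solution code.

```python
import numpy as np

# Stack several individual samples along a new batch axis.
samples = [np.random.rand(28, 28, 1).astype(np.float32) for _ in range(8)]
batch = np.stack(samples, axis=0)
print(batch.shape)  # → (8, 28, 28, 1)

# With the model loaded as in the solution, one call scores the whole batch:
# predictions = model.predict(batch)  # shape: (8, num_classes), one row per sample
```

Batch inference is usually much faster than calling predict() once per sample, since the work is vectorized across the batch dimension.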