TensorFlow · ~20 mins

Multi-class classification model in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Badge: Multi-class Master. Get all 5 challenges correct to earn it.
Model Choice · intermediate · 2:00
Choosing the correct output layer for multi-class classification

You want to build a neural network in TensorFlow to classify images into 5 different categories. Which output layer configuration is correct for this multi-class classification task?

A. Dense(5, activation='softmax')
B. Dense(1, activation='sigmoid')
C. Dense(1, activation='softmax')
D. Dense(5, activation='sigmoid')
💡 Hint

For multi-class classification, the output layer should have one neuron per class with a softmax activation.
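The hint can be sketched in plain numpy (illustrative only, not the TensorFlow layer itself): a 5-unit softmax output turns 5 raw scores into 5 class probabilities that sum to 1.

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability, then normalize.
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1, -1.0, 0.5])  # one raw score per class
probs = softmax(logits)
print(probs)        # 5 probabilities, one per class
print(probs.sum())  # sums to 1 (up to float precision)
```

This is why the output layer needs one neuron per class: each probability corresponds to one category, and softmax couples them into a single distribution.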

Metrics · intermediate · 2:00
Correct loss function for multi-class classification

Which loss function should you use in TensorFlow when training a multi-class classification model with one-hot encoded labels?

A. tf.keras.losses.SparseCategoricalCrossentropy()
B. tf.keras.losses.BinaryCrossentropy()
C. tf.keras.losses.MeanSquaredError()
D. tf.keras.losses.CategoricalCrossentropy()
💡 Hint

One-hot encoded labels require a loss function that compares probability distributions.
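A minimal numpy sketch of what categorical cross-entropy computes (illustrative, not the tf.keras internals): it compares a one-hot target distribution against the predicted probabilities, penalizing confident wrong predictions heavily.

```python
import numpy as np

def categorical_crossentropy(y_true_onehot, y_pred_probs, eps=1e-7):
    # -sum over classes of true_prob * log(pred_prob)
    y_pred_probs = np.clip(y_pred_probs, eps, 1.0)
    return -np.sum(y_true_onehot * np.log(y_pred_probs))

y_true = np.array([0.0, 1.0, 0.0])   # one-hot label for class 1
good = np.array([0.1, 0.8, 0.1])     # confident and correct
bad = np.array([0.7, 0.2, 0.1])      # confident and wrong
print(categorical_crossentropy(y_true, good))  # small loss
print(categorical_crossentropy(y_true, bad))   # much larger loss
```

Because the label is one-hot, only the predicted probability of the true class contributes to the loss.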

Predict Output · advanced · 2:00
Output shape of model predictions

Given the following TensorFlow model for 4-class classification, what is the shape of the output predictions for a batch of 10 samples?

TensorFlow
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(4, activation='softmax')
])

sample_input = np.random.random((10, 8))
predictions = model(sample_input)
predictions_shape = predictions.shape
A. (10, 4)
B. (4, 10)
C. (10, 1)
D. (1, 4)
💡 Hint

The output shape matches the batch size and number of classes.
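The shape propagation can be checked with plain matrix multiplication in numpy (a sketch of the Dense layers' kernels only, biases and activations omitted since they do not change shapes):

```python
import numpy as np

batch = np.random.random((10, 8))  # 10 samples, 8 features each
w1 = np.random.random((8, 16))     # Dense(16) kernel: (in_features, units)
w2 = np.random.random((16, 4))     # Dense(4) kernel

hidden = batch @ w1  # (10, 8) @ (8, 16) -> (10, 16)
out = hidden @ w2    # (10, 16) @ (16, 4) -> (10, 4)
print(out.shape)     # (batch_size, num_classes)
```

The batch dimension passes through unchanged; the last dimension becomes the number of units in the final layer.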

Hyperparameter · advanced · 2:00
Effect of batch size on training stability

In training a multi-class classification model, what is a common effect of increasing the batch size too much?

A. Training becomes more noisy and always improves generalization
B. Model always overfits immediately
C. Training becomes less noisy but may converge to sharp minima causing worse generalization
D. Training speed decreases significantly without affecting accuracy
💡 Hint

Think about how batch size affects gradient noise and generalization.
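The noise effect can be illustrated numerically: a mini-batch gradient is a sample mean over per-sample gradients, so its standard deviation shrinks roughly as 1/sqrt(batch_size). A toy numpy sketch with synthetic per-sample gradients (the values here are hypothetical, chosen only to show the trend):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for per-sample gradients with mean 1.0 and noise 2.0.
per_sample_grads = rng.normal(loc=1.0, scale=2.0, size=100_000)

def batch_grad_std(batch_size, n_batches=1000):
    # Std. dev. of the averaged gradient across many sampled mini-batches.
    batches = rng.choice(per_sample_grads, size=(n_batches, batch_size))
    return batches.mean(axis=1).std()

small = batch_grad_std(8)
large = batch_grad_std(512)
print(small, large)  # larger batches -> much less noisy gradient estimates
```

Less gradient noise is not automatically better: the noise in small-batch updates can help escape sharp minima, which is why very large batches sometimes generalize worse.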

🔧 Debug · expert · 3:00
Identifying the cause of poor multi-class model accuracy

You trained a multi-class classification model with 3 classes using one-hot encoded labels. The model's accuracy stays around 33% (random guess). Which of the following is the most likely cause?

A. Using CategoricalCrossentropy loss with one-hot encoded labels
B. Using SparseCategoricalCrossentropy loss with one-hot encoded labels
C. Using softmax activation in the output layer
D. Using Adam optimizer
💡 Hint

Check if the loss function matches the label format.
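The distinction the hint points at: CategoricalCrossentropy expects one-hot rows, while SparseCategoricalCrossentropy expects integer class indices. When each loss gets its matching label format, the two compute the same value. A numpy sketch (illustrative, not the tf.keras implementation):

```python
import numpy as np

probs = np.array([0.2, 0.7, 0.1])   # model output for one sample
onehot_label = np.array([0, 1, 0])  # one-hot label format
int_label = 1                       # integer-index label format

eps = 1e-7
# Categorical CE: dot the one-hot row against log-probabilities.
cat_ce = -np.sum(onehot_label * np.log(np.clip(probs, eps, 1.0)))
# Sparse categorical CE: index the predicted probability directly.
sparse_ce = -np.log(np.clip(probs[int_label], eps, 1.0))
print(cat_ce, sparse_ce)  # identical when label formats match the loss
```

Feeding one-hot rows to a sparse loss (or integer indices to a categorical loss) breaks this correspondence, and the model effectively trains against garbage targets, which is why accuracy stalls at chance level.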