TensorFlow · ~20 mins

Batch size and epochs in TensorFlow - Practice Problems & Coding Challenges

Challenge - 5 Problems
🧠 Conceptual · intermediate
Effect of Batch Size on Training Speed

Which statement best describes how increasing the batch size affects the training speed of a neural network?

A) Increasing batch size always slows down training because more data is processed at once.
B) Increasing batch size generally speeds up training per epoch but may require more memory.
C) Batch size does not affect training speed; it only changes model accuracy.
D) Smaller batch sizes always use less memory and train faster than larger batch sizes.
💡 Hint

Think about how processing more samples at once affects computation and memory.
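To build intuition for the trade-off in this question, the relationship between batch size and steps per epoch can be sketched in plain Python (no TensorFlow needed); the dataset size below is an arbitrary example, not from the problem:

```python
import math

def steps_per_epoch(dataset_size, batch_size):
    """Number of gradient updates per epoch (final partial batch included)."""
    return math.ceil(dataset_size / batch_size)

# Larger batches mean fewer (but heavier, more memory-hungry) steps per epoch.
for bs in (32, 128, 512):
    print(f"batch_size={bs:>4} -> {steps_per_epoch(50_000, bs)} steps/epoch")
```

Fewer steps per epoch generally means less per-step overhead and better hardware utilization, at the cost of holding more activations in memory at once.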

Predict Output · intermediate
Number of Weight Updates with Different Batch Sizes

Consider a dataset with 1000 samples. You train a model for 5 epochs with batch size 100. How many weight updates occur during training?

TensorFlow
dataset_size = 1000
batch_size = 100
epochs = 5
updates = (dataset_size // batch_size) * epochs
print(updates)
A) 50
B) 5
C) 500
D) 100
💡 Hint

Calculate how many batches fit in one epoch, then multiply by epochs.
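The hint's arithmetic generalizes beyond exact division. One caveat worth knowing (not part of this problem, which divides evenly) is whether a final partial batch counts as an extra update; a sketch with deliberately non-divisible numbers:

```python
import math

def total_updates(dataset_size, batch_size, epochs, drop_remainder=False):
    """Weight updates over a run: batches per epoch times number of epochs."""
    if drop_remainder:
        batches = dataset_size // batch_size        # partial final batch discarded
    else:
        batches = math.ceil(dataset_size / batch_size)  # partial batch still updates
    return batches * epochs

# 1050 samples, batch size 100: the leftover 50 samples add one update per epoch.
print(total_updates(1050, 100, 5), total_updates(1050, 100, 5, drop_remainder=True))
```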

Hyperparameter · advanced
Choosing Batch Size for Memory Constraints

You want to train a deep neural network on a GPU with limited memory. Which batch size choice is best to avoid out-of-memory errors while maintaining reasonable training speed?

A) Batch size does not affect memory usage; focus on learning rate instead.
B) Use a batch size of 1 to minimize memory usage but expect slower training.
C) Use the largest batch size that fits in memory without causing errors.
D) Use a very large batch size to reduce the number of updates.
💡 Hint

Think about balancing memory limits and training efficiency.
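A common practical tactic for this situation is to start from a large candidate batch size and halve it until training fits in memory. The sketch below simulates that search with a made-up linear memory model (the budget and per-sample cost are hypothetical numbers; real usage also depends on model architecture and framework overhead):

```python
def largest_fitting_batch(mem_budget_mb, mem_per_sample_mb, start=1024):
    """Halve a candidate batch size until its estimated memory fits the budget."""
    bs = start
    while bs > 1 and bs * mem_per_sample_mb > mem_budget_mb:
        bs //= 2
    return bs

# Hypothetical numbers: 8 GB budget, ~40 MB of activations per sample.
print(largest_fitting_batch(8_000, 40))
```

In real training the same loop is driven by actually attempting a training step and catching the out-of-memory error, rather than by a formula.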

Metrics · advanced
Effect of Epochs on Model Performance Metrics

You train a model for 10 epochs and observe training accuracy improves but validation accuracy plateaus after 5 epochs. What does this indicate?

A) The model is overfitting; training longer may reduce validation accuracy.
B) The model is underfitting and needs more epochs.
C) The batch size is too large, causing poor validation accuracy.
D) The learning rate is too high, causing unstable training.
💡 Hint

Consider what happens when training accuracy improves but validation does not.
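The pattern in this question (training accuracy rising while validation accuracy stalls) is the usual trigger for early stopping. A minimal, framework-independent sketch of the stopping rule (the accuracy history below is invented to mirror the question):

```python
def should_stop(val_history, patience=3):
    """Stop when validation accuracy hasn't improved for `patience` epochs."""
    if len(val_history) <= patience:
        return False
    best_before = max(val_history[:-patience])
    recent = val_history[-patience:]
    return max(recent) <= best_before

# Validation accuracy plateaus after epoch 5, as in the question.
val_acc = [0.60, 0.68, 0.74, 0.78, 0.80, 0.80, 0.79, 0.80]
print(should_stop(val_acc))
```

In Keras the equivalent behavior comes from `tf.keras.callbacks.EarlyStopping(monitor='val_accuracy', patience=3)`, which also supports restoring the best weights.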

🔧 Debug · expert
Identifying Epochs and Batch Size Impact in TensorFlow Training Loop

Given the code below, what will be the printed output for the number of batches processed per epoch?

TensorFlow
import tensorflow as tf

# Dataset with 120 samples
x = tf.random.normal([120, 10])
y = tf.random.uniform([120], maxval=2, dtype=tf.int32)

batch_size = 25
epochs = 3

dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(batch_size)

for epoch in range(epochs):
    batch_count = 0
    for batch_x, batch_y in dataset:
        batch_count += 1
    print(f"Epoch {epoch+1} batches: {batch_count}")
A)
Epoch 1 batches: 6
Epoch 2 batches: 6
Epoch 3 batches: 6
B)
Epoch 1 batches: 4
Epoch 2 batches: 4
Epoch 3 batches: 4
C)
Epoch 1 batches: 3
Epoch 2 batches: 3
Epoch 3 batches: 3
D)
Epoch 1 batches: 5
Epoch 2 batches: 5
Epoch 3 batches: 5
💡 Hint

Calculate how many batches of size 25 fit into 120 samples.
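One detail to keep in mind while working this out: `tf.data`'s `.batch()` keeps the final partial batch by default, so the batch count is a ceiling rather than a floor, unless `drop_remainder=True` is passed. A plain-Python sketch of both behaviors (the example numbers are arbitrary, not the question's):

```python
import math

def batch_count(n_samples, batch_size, drop_remainder=False):
    """Batches per epoch under tf.data's .batch() semantics."""
    if drop_remainder:
        return n_samples // batch_size            # partial final batch dropped
    return math.ceil(n_samples / batch_size)      # partial final batch kept (default)

# Non-divisible example: 130 samples in batches of 40.
print(batch_count(130, 40), batch_count(130, 40, drop_remainder=True))
```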