Which statement best describes how increasing the batch size affects the training speed of a neural network?
Think about how processing multiple samples at once affects computation.
Using a larger batch size allows the model to process more samples simultaneously, which often speeds up training per epoch due to parallel computation on hardware like GPUs.
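One concrete per-epoch effect is easy to count: with a fixed dataset size, a larger batch size means fewer optimizer steps per epoch. A minimal sketch (the dataset size and batch sizes here are arbitrary, illustrative values):

```python
import math

n_samples = 10_000  # hypothetical dataset size

# Fewer, larger batches per epoch as batch size grows; each batch is
# processed in parallel on hardware like a GPU, so wall-clock time
# per epoch often drops.
for batch_size in (32, 128, 512):
    steps_per_epoch = math.ceil(n_samples / batch_size)
    print(batch_size, steps_per_epoch)
```

Note this only covers steps per epoch; very large batches can change convergence behavior, so faster epochs do not automatically mean faster convergence.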
What is the shape of the output tensor after passing a batch of images through a model?
Assume the model outputs a tensor of shape (batch_size, 10) for 10 classes.

import torch

batch_size = 16
model_output = torch.randn(batch_size, 10)  # stand-in for a real model's output
print(model_output.shape)  # torch.Size([16, 10])
The first dimension is the batch size, the second is the number of classes.
The output shape is (batch_size, number_of_classes). Here batch_size is 16 and classes are 10, so shape is (16, 10).
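A common follow-up step, sketched here for the same (16, 10) shape: to turn the per-class scores into one predicted class per sample, take the argmax over the class dimension, which collapses it and leaves only the batch dimension.

```python
import torch

batch_size = 16
model_output = torch.randn(batch_size, 10)  # simulated (batch, classes) scores

# argmax over dim=1 (the class dimension) gives one class index per sample
preds = model_output.argmax(dim=1)
print(preds.shape)  # torch.Size([16])
```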
You have a GPU with limited memory. Which batch size choice is best to avoid out-of-memory errors while training a large model?
Think about balancing memory use and training efficiency.
The best choice is the largest batch size that still fits in GPU memory. Too large a batch triggers out-of-memory errors; too small a batch underutilizes the hardware and slows training. Picking the largest size that fits avoids errors while keeping training efficient.
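A common practical pattern for finding that batch size is to start large and halve on out-of-memory failures (CUDA reports these as RuntimeError). This is a minimal sketch in plain Python; `find_max_batch_size` and `fake_step` are hypothetical names, and the 128-sample memory limit is simulated, not a real GPU measurement:

```python
def find_max_batch_size(step_fn, start=512, min_size=1):
    """Halve the batch size until one training step succeeds.

    step_fn(batch_size) should run a single forward/backward pass and
    raise RuntimeError on out-of-memory, as CUDA does.
    """
    bs = start
    while bs >= min_size:
        try:
            step_fn(bs)
            return bs
        except RuntimeError:
            bs //= 2
    raise RuntimeError("even the minimum batch size does not fit")

def fake_step(bs):
    # Simulated step: pretend anything above 128 samples exceeds memory.
    if bs > 128:
        raise RuntimeError("out of memory (simulated)")

print(find_max_batch_size(fake_step))  # 128
```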
How does enabling shuffling in the data loader affect training metrics like loss and accuracy?
Consider how randomizing data order affects learning.
Shuffling prevents the model from seeing examples in the same (possibly sorted or clustered) order every epoch, so each batch's gradient is a less biased estimate of the full-dataset gradient. This typically stabilizes training and improves generalization, which shows up as better loss and accuracy.
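Enabling this in PyTorch is a single DataLoader flag. A small sketch on a toy dataset, with a seeded generator so the shuffled order is reproducible (the dataset values here are arbitrary):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.arange(8))

# Seeding the generator makes the shuffled order reproducible across runs.
g = torch.Generator().manual_seed(0)
shuffled = DataLoader(dataset, batch_size=4, shuffle=True, generator=g)
ordered = DataLoader(dataset, batch_size=4, shuffle=False)

print([b[0].tolist() for b in ordered])   # [[0, 1, 2, 3], [4, 5, 6, 7]]
print([b[0].tolist() for b in shuffled])  # some permutation of 0..7
```

With shuffle=True the order is re-randomized each epoch (each fresh iteration over the loader), so no batch composition repeats systematically.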
What is wrong with the shuffle argument in this PyTorch DataLoader code?
from torch.utils.data import DataLoader, TensorDataset
import torch
data = torch.randn(100, 3)
targets = torch.randint(0, 2, (100,))
dataset = TensorDataset(data, targets)
loader = DataLoader(dataset, batch_size=10, shuffle=False)
for batch_data, batch_targets in loader:
    print(batch_data.shape, batch_targets.shape)
loader = DataLoader(dataset, batch_size=10, shuffle='yes')  # incorrect shuffle argument
Check the type expected for the shuffle argument.
The shuffle argument expects a bool, True or False. Passing the string 'yes' is a bug: depending on the PyTorch version it may be rejected with an error, or silently treated as truthy (any non-empty string is truthy in Python) and quietly enable shuffling. Always pass an actual bool.
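The fix, sketched with the same dataset shapes as the snippet above, is simply to pass a real bool:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

data = torch.randn(100, 3)
targets = torch.randint(0, 2, (100,))
dataset = TensorDataset(data, targets)

# shuffle takes a bool, not a string like 'yes'
loader = DataLoader(dataset, batch_size=10, shuffle=True)
for batch_data, batch_targets in loader:
    print(batch_data.shape, batch_targets.shape)  # torch.Size([10, 3]) torch.Size([10])
```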