PyTorch · ~20 mins

__getitem__ and __len__ in PyTorch - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output
intermediate
Output of __getitem__ in a PyTorch Dataset
Consider this PyTorch Dataset class. What will be the output of dataset[2]?
PyTorch
import torch
from torch.utils.data import Dataset

class SimpleDataset(Dataset):
    def __init__(self):
        self.data = [10, 20, 30, 40, 50]
    def __len__(self):
        return len(self.data)
    def __getitem__(self, idx):
        return self.data[idx] * 2

dataset = SimpleDataset()
output = dataset[2]
A) 30
B) 60
C) 40
D) IndexError
💡 Hint
Remember that __getitem__ returns the value at the given index multiplied by 2.
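To verify the reasoning, the class above can be run directly. Since __getitem__ and __len__ are plain Python protocol methods, subclassing torch.utils.data.Dataset is not required for indexing to work, so this sketch omits the torch import:

```python
# Plain-Python stand-in for the quiz's SimpleDataset; torch is not
# needed here because dataset[2] only invokes __getitem__.
class SimpleDataset:
    def __init__(self):
        self.data = [10, 20, 30, 40, 50]

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # Returns the stored value at idx, doubled
        return self.data[idx] * 2

dataset = SimpleDataset()
print(dataset[2])  # prints 60: self.data[2] is 30, and 30 * 2 == 60
```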
Model Choice
intermediate
Choosing __len__ for a Custom Dataset
You have a dataset class with a list of 1000 images. Which implementation of __len__ correctly returns the dataset size?
A)
def __len__(self):
    return 1000
B)
def __len__(self):
    return self.images[0]
C)
def __len__(self):
    return len(self.images)
D)
def __len__(self):
    return self.images
💡 Hint
The length should reflect the number of items in the images list.
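A minimal sketch (using a hypothetical `images` list of placeholder values, not real image data) of why delegating to len(self.images) is the robust choice: it tracks the true size automatically, whereas a hard-coded constant goes stale as soon as the data changes:

```python
class ImageDataset:
    def __init__(self, images):
        self.images = images

    def __len__(self):
        # len() of the backing list always reflects the actual item count
        return len(self.images)

# 1000 placeholder "images"
ds = ImageDataset(images=list(range(1000)))
print(len(ds))        # prints 1000

# The same class works unchanged for a different dataset size;
# a hard-coded `return 1000` would now be wrong.
ds_small = ImageDataset(images=list(range(10)))
print(len(ds_small))  # prints 10
```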
🔧 Debug
advanced
Debugging __getitem__ Index Error
This Dataset class raises an IndexError when accessing dataset[5]. Why?
PyTorch
class MyDataset:
    def __init__(self):
        self.data = [1, 2, 3, 4, 5]
    def __len__(self):
        return len(self.data)
    def __getitem__(self, idx):
        return self.data[idx + 1]

dataset = MyDataset()
output = dataset[5]
A) Because the data list is empty
B) Because __len__ returns the wrong length
C) Because __getitem__ returns a list instead of a number
D) Because __getitem__ accesses index 6, which is out of range
💡 Hint
Check how the index is used inside __getitem__.
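Running the buggy class makes the off-by-one concrete: asking for dataset[5] makes __getitem__ read self.data[6], which is past the last valid index (4) of a five-element list:

```python
class MyDataset:
    def __init__(self):
        self.data = [1, 2, 3, 4, 5]

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        # Bug: shifts every requested index up by one
        return self.data[idx + 1]

dataset = MyDataset()
try:
    dataset[5]  # attempts self.data[6]; valid indices are 0..4
except IndexError as e:
    print("IndexError:", e)
```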
Hyperparameter
advanced
Effect of __len__ on DataLoader Batching
If a Dataset's __len__ returns 50 but the dataset actually contains 100 samples, what happens when using a DataLoader with a batch size of 10?
A) DataLoader will load only 50 samples in 5 batches
B) DataLoader will load all 100 samples in 10 batches
C) DataLoader will raise a ValueError due to mismatch
D) DataLoader will load 100 samples but batch size will be ignored
💡 Hint
DataLoader uses __len__ to know dataset size.
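DataLoader's default sequential sampler draws indices from range(len(dataset)), so a misreported length silently hides samples. The sketch below imitates that behavior without torch (`simple_batches` is a hypothetical helper written for illustration, not part of the PyTorch API):

```python
class HalfLenDataset:
    def __init__(self):
        self.data = list(range(100))  # 100 actual samples

    def __len__(self):
        return 50  # misreports the size

    def __getitem__(self, idx):
        return self.data[idx]

def simple_batches(dataset, batch_size):
    # Mirrors the default sampler: indices come only from
    # range(len(dataset)), so samples beyond that are never touched.
    batch = []
    for i in range(len(dataset)):
        batch.append(dataset[i])
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

batches = list(simple_batches(HalfLenDataset(), 10))
print(len(batches))                   # prints 5: only 5 batches
print(sum(len(b) for b in batches))   # prints 50: half the data is seen
```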
🧠 Conceptual
expert
Why Implement Both __getitem__ and __len__ in PyTorch Dataset?
Why is it important to implement both __getitem__ and __len__ methods in a PyTorch Dataset class?
A) Because DataLoader uses __len__ to know the dataset size and __getitem__ to fetch samples by index
B) Because __getitem__ initializes the dataset and __len__ trains the model
C) Because __len__ loads data and __getitem__ shuffles it
D) Because __len__ is used only for printing and __getitem__ is optional
💡 Hint
Think about how DataLoader interacts with Dataset.
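The division of labor between the two methods can be sketched with a loader-style loop: the size comes from __len__, and each sample is fetched by index through __getitem__ (`ToyDataset` is a made-up example class):

```python
class ToyDataset:
    def __init__(self, data):
        self.data = data

    def __len__(self):
        # Tells the loader how many valid indices exist
        return len(self.data)

    def __getitem__(self, idx):
        # Fetches exactly one sample, addressed by index
        return self.data[idx]

ds = ToyDataset([5, 6, 7])
# A loader-style pass: size from __len__, samples from __getitem__
samples = [ds[i] for i in range(len(ds))]
print(samples)  # prints [5, 6, 7]
```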