Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)
Complete the code to set the number of workers for parallel data loading in PyTorch.
PyTorch
train_loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=[1])
Hint: Common Mistakes
Using negative numbers for num_workers causes errors.
Setting num_workers to None is invalid.
Explanation:
Setting num_workers=0 means data loading happens in the main process without parallel workers.
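For reference, a minimal sketch of the single-process case (assumes PyTorch is installed; the dataset here is synthetic and illustrative):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Tiny synthetic dataset: 100 samples with 10 features each.
dataset = TensorDataset(torch.randn(100, 10), torch.randint(0, 2, (100,)))

# num_workers=0: batches are loaded in the main process,
# with no worker subprocesses spawned.
train_loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=0)

batch_x, batch_y = next(iter(train_loader))
print(batch_x.shape)  # torch.Size([32, 10])
```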
2. Fill in the blank (medium)
Complete the code to enable parallel data loading with 2 workers in PyTorch.
PyTorch
train_loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=[1])
Hint: Common Mistakes
Setting num_workers to 0 disables parallel loading.
Using too many workers can slow down the system.
Explanation:
Setting num_workers=2 uses two subprocesses to load data in parallel, speeding up training.
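A sketch of the two-worker setup from this task, using a small synthetic dataset (the dataset and sizes here are illustrative, not from the task):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic dataset: 256 samples with 8 features each.
dataset = TensorDataset(torch.randn(256, 8), torch.zeros(256))

# num_workers=2 spawns two subprocesses that prefetch batches
# while the main process consumes them.
train_loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=2)

n_batches = sum(1 for _ in train_loader)
print(n_batches)  # 256 samples / 64 per batch = 4 batches
```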
3. Fill in the blank (hard)
Fix the error in the code by choosing the correct num_workers value for DataLoader.
PyTorch
loader = DataLoader(dataset, batch_size=16, shuffle=False, num_workers=[1])
Hint: Common Mistakes
Using negative numbers like -2 causes errors.
Passing None is not accepted.
Explanation:
Negative values are invalid for num_workers. Zero is the safe choice: it disables parallel loading and loads data in the main process.
4. Fill in the blank (hard)
Fill both blanks to create a DataLoader with batch size 128 and 6 parallel workers.
PyTorch
loader = DataLoader(dataset, batch_size=[1], shuffle=True, num_workers=[2])
Hint: Common Mistakes
Confusing batch size with num_workers values.
Using too many workers causing slowdowns.
Explanation:
Batch size is set to 128 and num_workers to 6 for parallel loading.
5. Fill in the blank (hard)
Fill all three blanks to create a DataLoader with batch size 256, shuffle disabled, and 8 workers.
PyTorch
loader = DataLoader(dataset, batch_size=[1], shuffle=[2], num_workers=[3])
Hint: Common Mistakes
Setting shuffle to True when it should be False.
Using invalid types for batch size or num_workers.
Explanation:
Batch size is 256, shuffle is False to disable shuffling, and num_workers is 8 for parallel loading.
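Putting all three parameters together, a sketch with a synthetic dataset. The data is just the indices 0..511 so the effect of shuffle=False is visible; num_workers is lowered from the task's 8 to 2 here only to keep the sketch lightweight:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Features are the indices 0..511, so we can verify that
# shuffle=False preserves dataset order.
features = torch.arange(512).float().unsqueeze(1)
dataset = TensorDataset(features, torch.zeros(512))

# Task values: batch_size=256, shuffle=False, num_workers=8.
# Workers reduced to 2 to keep this example light.
loader = DataLoader(dataset, batch_size=256, shuffle=False, num_workers=2)

first_batch, _ = next(iter(loader))
print(first_batch[:3].squeeze().tolist())  # [0.0, 1.0, 2.0] -- deterministic order
```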