Recall & Review
beginner
Q: What is the main advantage of using a GPU for machine learning inference?
A: GPUs can process many operations in parallel, making them much faster for large, complex models.
beginner
Q: Why might CPUs be preferred over GPUs for some inference tasks?
A: CPUs suit smaller models and workloads that need low latency for single requests, and they draw less power.
intermediate
Q: How does batch size affect GPU inference performance?
A: Larger batch sizes improve GPU efficiency by exploiting parallelism more fully, but they increase per-request latency.
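The batching tradeoff above can be sketched with a toy model: treat one GPU call as a fixed launch overhead plus a small per-item cost. The timing numbers below are illustrative assumptions, not measurements.

```python
# Hypothetical model of batched GPU inference: a fixed launch overhead
# plus a small per-item cost (the parallel hardware amortizes the overhead).
# The millisecond values are made-up assumptions for illustration.

def batched_inference_stats(batch_size, fixed_overhead_ms=5.0, per_item_ms=0.5):
    batch_time_ms = fixed_overhead_ms + per_item_ms * batch_size
    throughput = batch_size / (batch_time_ms / 1000.0)  # items per second
    latency_ms = batch_time_ms  # each request waits for the whole batch
    return throughput, latency_ms

for bs in (1, 8, 64):
    tp, lat = batched_inference_stats(bs)
    print(f"batch={bs:3d}  throughput={tp:8.0f}/s  latency={lat:.1f} ms")
```

Under these assumptions throughput climbs steeply with batch size while per-request latency grows linearly, which is exactly the tradeoff the card describes.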
intermediate
Q: What is a tradeoff when using GPUs for inference in terms of cost?
A: GPUs are more expensive to run and maintain, so they can raise operational costs compared to CPUs.
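A back-of-the-envelope calculation makes the cost tradeoff concrete. The hourly prices and request rates below are illustrative assumptions, not real cloud pricing.

```python
# Toy cost comparison: all prices and throughput figures are assumptions.
gpu_cost_per_hour = 2.50            # assumed GPU instance price (USD/hour)
cpu_cost_per_hour = 0.20            # assumed CPU instance price (USD/hour)
gpu_requests_per_hour = 1_000_000   # assumed high-throughput GPU workload
cpu_requests_per_hour = 50_000      # assumed CPU workload

gpu_cost_per_1k = gpu_cost_per_hour / gpu_requests_per_hour * 1000
cpu_cost_per_1k = cpu_cost_per_hour / cpu_requests_per_hour * 1000
print(f"GPU: ${gpu_cost_per_1k:.4f} per 1k requests")
print(f"CPU: ${cpu_cost_per_1k:.4f} per 1k requests")
```

The point of the sketch: at high utilization a GPU can be cheaper per request despite the higher hourly price, while at low traffic the idle GPU's hourly cost dominates and the CPU wins.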
intermediate
Q: Explain why CPU inference might be more energy efficient than GPU inference.
A: For small or simple inference tasks, CPUs draw far less power, making them more energy efficient in those cases.
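Energy is power times time, so a GPU that finishes faster can still spend more energy if its power draw is high enough. The wattages and runtimes below are assumed values chosen to illustrate the effect.

```python
# Energy = power (W) x time (s). All figures are illustrative assumptions.
cpu_watts, cpu_seconds = 65.0, 0.050   # assumed CPU draw; slower per request
gpu_watts, gpu_seconds = 300.0, 0.015  # assumed GPU draw; faster per request

cpu_joules = cpu_watts * cpu_seconds
gpu_joules = gpu_watts * gpu_seconds
print(f"CPU: {cpu_joules:.2f} J per request, GPU: {gpu_joules:.2f} J per request")
```

With these assumptions the CPU uses less energy per request even though it is slower, which is why small, simple inference tasks often run more efficiently on CPU.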
Q: Which hardware is generally better for large batch inference?
A: GPUs. They excel at processing large batches in parallel, improving throughput.
Q: What is a common reason to choose CPU over GPU for inference?
A: CPUs can serve single requests with lower latency than GPUs.
Q: How does increasing batch size affect GPU inference latency?
A: Larger batches increase per-request latency but improve throughput.
Q: Which hardware typically costs more to operate for inference?
A: GPUs. They have higher operational costs due to power draw and maintenance.
Q: Why might CPU inference be more energy efficient?
A: CPUs consume less power when handling small or simple inference tasks.
Q: Describe the main tradeoffs between using GPUs and CPUs for machine learning inference.
Hint: Think about speed, cost, and power use.
Q: Explain how batch size influences the choice between GPU and CPU for inference.
Hint: Consider how many requests are processed at once.