Computer Vision · ~20 mins

ONNX Runtime in Computer Vision - Practice Problems & Coding Challenges

Challenge - 5 Problems
Problem 1: Predict Output (intermediate)
ONNX Runtime Model Inference Output
What is the output shape of the prediction when running this ONNX Runtime code snippet on a model expecting input shape (1, 3, 224, 224)?
import onnxruntime as ort
import numpy as np

session = ort.InferenceSession('model.onnx')
input_name = session.get_inputs()[0].name
input_data = np.random.randn(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: input_data})
prediction = outputs[0]
print(prediction.shape)
A. (1, 1000)
B. (224, 224, 3)
C. (3, 224, 224)
D. (1000, 1)
💡 Hint
The model outputs class probabilities for 1000 classes in a batch of 1.
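The hint can be checked with a numpy stand-in for a classifier's final stage. The global-average-pooling step and the toy weight matrix below are illustrative assumptions, not the real model's layers; only the batch-of-1, 1000-class shape logic matters:

```python
import numpy as np

# Same input shape as in the snippet above
batch = np.random.randn(1, 3, 224, 224).astype(np.float32)

# Toy stand-in for the network: pool spatial dims, then a linear head
features = batch.mean(axis=(2, 3))               # (1, 3) after global average pooling
weights = np.zeros((3, 1000), dtype=np.float32)  # hypothetical 1000-class head
logits = features @ weights

print(logits.shape)  # (1, 1000): batch of 1, one score per class
```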
Problem 2: Model Choice (intermediate)
Choosing ONNX Runtime Execution Provider
Which ONNX Runtime execution provider should you choose to maximize inference speed on a machine with an NVIDIA GPU?
A. CPUExecutionProvider
B. CUDAExecutionProvider
C. TensorRTExecutionProvider
D. OpenVINOExecutionProvider
💡 Hint
NVIDIA GPUs are best supported by a specific provider.
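As a sketch, a session is created with an explicit provider priority list; ONNX Runtime tries each provider in order and falls back to the next if one is unavailable. The provider names come from the standard onnxruntime-gpu build, and the deferred import plus `model_path` parameter are illustrative choices, not part of the quiz:

```python
# ONNX Runtime tries providers in list order, falling back if one is missing.
PREFERRED_PROVIDERS = ["CUDAExecutionProvider", "CPUExecutionProvider"]

def make_gpu_session(model_path):
    # Deferred import so this sketch loads even without onnxruntime-gpu installed
    import onnxruntime as ort
    return ort.InferenceSession(model_path, providers=PREFERRED_PROVIDERS)
```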
Problem 3: Hyperparameter (advanced)
Batch Size Effect on ONNX Runtime Performance
Increasing the batch size during ONNX Runtime inference usually has which effect on throughput and latency?
A. Throughput increases, latency increases
B. Throughput decreases, latency decreases
C. Throughput decreases, latency increases
D. Throughput increases, latency decreases
💡 Hint
Think about processing more inputs at once versus time per input.
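The intuition can be sketched with a toy cost model: each inference call pays a fixed overhead plus a per-image cost, so a larger batch raises the per-call latency but amortizes the overhead across more images. The millisecond figures below are made up purely for illustration:

```python
OVERHEAD_MS = 5.0   # assumed fixed cost per inference call
PER_ITEM_MS = 2.0   # assumed incremental cost per image in the batch

def latency_ms(batch_size):
    # Time for one inference call on a batch of this size
    return OVERHEAD_MS + PER_ITEM_MS * batch_size

def throughput(batch_size):
    # Images processed per millisecond
    return batch_size / latency_ms(batch_size)

print(latency_ms(1), latency_ms(8))      # per-call latency grows with batch size
print(throughput(1), throughput(8))      # throughput grows as overhead amortizes
```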
Problem 4: 🔧 Debug (advanced)
ONNX Runtime Input Type Error
What error will this ONNX Runtime code raise when passing input data as a Python list instead of a numpy array?
import onnxruntime as ort

session = ort.InferenceSession('model.onnx')
input_name = session.get_inputs()[0].name
input_data = [[0.5]*224]*224  # Python list, not numpy array
outputs = session.run(None, {input_name: input_data})
A. ValueError: Input shape mismatch
B. No error, runs successfully
C. RuntimeError: Model file not found
D. TypeError: Expected numpy.ndarray for input
💡 Hint
ONNX Runtime expects inputs as numpy arrays with correct dtype.
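The fix is to convert the nested list into a numpy array with the dtype the model expects, and to make the shape match the model input as well. Broadcasting the single 224×224 plane across 3 channels below is illustrative preprocessing, not something a real pipeline would necessarily do:

```python
import numpy as np

raw = [[0.5] * 224] * 224                    # nested Python list from the snippet
arr = np.asarray(raw, dtype=np.float32)      # numpy array with the dtype ORT expects

# The shape must also match the model's declared input, e.g. (1, 3, 224, 224);
# broadcasting one plane across the 3 channels is for illustration only:
batched = np.broadcast_to(arr, (1, 3, 224, 224)).astype(np.float32)

print(batched.shape, batched.dtype)
```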
Problem 5: 🧠 Conceptual (expert)
ONNX Runtime Model Optimization
Which ONNX Runtime feature allows you to optimize a model by fusing nodes and eliminating redundant operations to improve inference speed?
A. Execution Provider Selection
B. Dynamic Batching
C. Graph Optimization Level
D. Model Quantization
💡 Hint
This feature works on the model graph before running inference.
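As a sketch, the optimization level is set on `SessionOptions` before the session is built. The `GraphOptimizationLevel.ORT_ENABLE_ALL` enum and `graph_optimization_level` attribute are from the onnxruntime Python API; the deferred import and `model_path` parameter are illustrative:

```python
def make_optimized_session(model_path):
    # Deferred import so this sketch loads even where onnxruntime is absent
    import onnxruntime as ort

    opts = ort.SessionOptions()
    # ORT_ENABLE_ALL enables the full set of graph optimizations,
    # including node fusion and elimination of redundant operations.
    opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL
    return ort.InferenceSession(model_path, sess_options=opts)
```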