Complete the code to load an ONNX model using ONNX Runtime.
import onnxruntime as ort
session = ort.InferenceSession([1])
The InferenceSession constructor expects the path to the ONNX model as a string.
Complete the code to prepare input data as a dictionary for ONNX Runtime inference.
import numpy as np
input_name = session.get_inputs()[0].name
input_data = np.array([[1.0, 2.0, 3.0]], dtype=np.float32)
inputs = [1]
ONNX Runtime expects inputs as a dictionary mapping input names to numpy arrays.
Fix the error in the code to run inference and get the output.
outputs = session.run(None, [1])
print(outputs[0])
The run method requires a dictionary of inputs, not just the data array or input name.
Fill both blanks to get the name of the first output and print its shape.
output_name = session.get_outputs()[[1]].name
print(session.run([output_name], [2])[0].shape)
The first output is at index 0, and the inputs dictionary must be passed to run.
Fill all three blanks to run inference on multiple inputs and get the second output.
input_names = [inp.name for inp in session.get_inputs()]
inputs = {name: np.random.rand(1, 3).astype(np.float32) for name in input_names}
outputs = session.run([session.get_outputs()[[1]].name], [2])
print(outputs[0].shape)

# Access input for first input name
first_input = inputs[[3]]
The second output is at index 1, the inputs dictionary must be passed to run, and the first input name is input_names[0].