When to use blocking (combinational) in Verilog - Time & Space Complexity
We want to understand how the time a simulator takes to evaluate combinational logic written with blocking assignments changes as the input size grows.
How does the number of operations grow when using blocking assignments in combinational blocks?
Analyze the time complexity of the following combinational logic using blocking assignments.
always @(*) begin
    for (int i = 0; i < N; i = i + 1) begin
        out[i] = in1[i] & in2[i];
    end
end
This code performs a bitwise AND on two input vectors and stores the result in an output vector, using blocking assignments inside a combinational `always @(*)` block.
Look for loops or repeated actions that affect execution time.
- Primary operation: The for-loop that processes each bit of the input arrays.
- How many times: It runs exactly N times, once for each element.
As the input size N grows, the number of operations grows proportionally.
| Input Size (N) | Approx. Operations |
|---|---|
| 10 | 10 operations |
| 100 | 100 operations |
| 1000 | 1000 operations |
Pattern observation: The operations increase linearly with input size.
Time Complexity: O(N)
This means the simulation time to evaluate the combinational block grows in direct proportion to the input size N. Note that this is simulation cost: in synthesized hardware the loop unrolls into N independent AND gates, so the O(N) growth shows up as gate count (area), not as propagation delay.
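For intuition, synthesis effectively unrolls the loop into one gate per bit. A minimal sketch of the unrolled equivalent (shown here for an assumed N = 4):

```verilog
// Unrolled equivalent for N = 4: four parallel AND gates.
// Each output bit depends only on its own pair of input bits,
// so in hardware all four are computed simultaneously.
assign out[0] = in1[0] & in2[0];
assign out[1] = in1[1] & in2[1];
assign out[2] = in1[2] & in2[2];
assign out[3] = in1[3] & in2[3];
```

The simulator still walks through N assignments per evaluation, which is where the O(N) simulation cost comes from.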
[X] Wrong: "Using blocking assignments in combinational logic makes execution constant time regardless of input size."
[OK] Correct: The simulator still evaluates each element's assignment one by one, so simulation time grows with input size.
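The one-by-one evaluation is visible in how blocking assignments order statements within a block: each assignment completes before the next begins, so later statements read values written earlier in the same pass. A small sketch (signal names `stage1`, `stage2` are illustrative, not from the example above):

```verilog
// Blocking assignments execute in source order within the block.
always @(*) begin
    stage1 = a & b;       // completes first
    stage2 = stage1 | c;  // reads the stage1 value just computed
    y      = stage2 ^ d;  // reads the stage2 value just computed
end
```

This sequential, dependency-respecting evaluation is exactly why N elements cost N operations in simulation.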
Understanding how combinational logic scales helps you design efficient hardware and explain your design choices clearly in interviews.
"What if we replaced the for-loop with nested loops processing a 2D array? How would the time complexity change?"