# What Is a Neural Network (Simplified)? Complexity Analysis in AI for Everyone
When learning about neural networks, it's helpful to understand how the amount of computation grows as the network gets bigger or processes more data.
Analyze the time complexity of the following simple neural network forward pass.
```python
inputs = [x1, x2, x3]  # input values
weights = [[w11, w12, w13], [w21, w22, w23]]  # weights for 2 neurons
outputs = []
for neuron_weights in weights:
    total = 0
    for i in range(len(inputs)):
        total += inputs[i] * neuron_weights[i]
    outputs.append(total)
```
This code calculates outputs for 2 neurons, each connected to 3 inputs, by multiplying and adding values.
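To make the idea concrete, here is a minimal runnable version of the same forward pass. The specific input and weight values are made up for illustration; they are not from the original example.

```python
# Runnable sketch of the forward pass with illustrative numbers.
inputs = [1.0, 2.0, 3.0]         # x1, x2, x3
weights = [[0.1, 0.2, 0.3],      # weights for neuron 1
           [0.4, 0.5, 0.6]]      # weights for neuron 2
outputs = []
for neuron_weights in weights:
    total = 0.0
    for i in range(len(inputs)):
        # One multiply-and-add per (neuron, input) pair
        total += inputs[i] * neuron_weights[i]
    outputs.append(total)
print(outputs)  # one weighted sum per neuron
```

Each neuron produces one number: the weighted sum of all three inputs.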
Look at the loops that repeat work:
- Primary operation: Multiplying each input by its weight and adding to total.
- How many times: For each neuron, it repeats for every input.
As the number of neurons or inputs grows, the work grows too.
| Network Size (n neurons × m inputs) | Approx. Operations |
|---|---|
| 10 neurons × 10 inputs | 100 multiplications and additions |
| 100 neurons × 100 inputs | 10,000 multiplications and additions |
| 1000 neurons × 1000 inputs | 1,000,000 multiplications and additions |
Pattern observation: The total work grows by multiplying the number of neurons by the number of inputs.
Time Complexity: O(n × m)
This means the time needed grows in proportion to the number of neurons (n) multiplied by the number of inputs (m).
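You can verify the pattern in the table by counting loop iterations directly. The helper below is a hypothetical function written for this illustration; it mirrors the nested loops of the forward pass and counts one operation per multiply-and-add.

```python
def count_operations(num_neurons, num_inputs):
    # Mirror the nested loops of the forward pass,
    # counting one multiply-and-add per iteration.
    ops = 0
    for _ in range(num_neurons):      # outer loop: once per neuron
        for _ in range(num_inputs):   # inner loop: once per input
            ops += 1
    return ops

print(count_operations(10, 10))      # matches the first table row
print(count_operations(100, 100))    # matches the second table row
```

The counts match the table exactly: the total work is always the product of the two sizes, which is what O(n × m) expresses.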
[X] Wrong: "The time only depends on the number of neurons or only on the number of inputs."
[OK] Correct: Both neurons and inputs matter because each neuron processes every input, so time grows with both.
Understanding how neural network size affects processing time helps you explain performance in real AI tasks clearly and confidently.
"What if we added a third loop for multiple layers? How would the time complexity change?"