Providing context and constraints in AI for Everyone - Time & Space Complexity
When we give context and constraints to an AI, the amount of work it does to understand and respond changes with them.
We want to see how the AI's effort grows as the context or constraints get larger.
Analyze the time complexity of the following AI processing steps.
```javascript
function processInput(context, constraints, userInput) {
  // Merge the context, constraints, and user input into one string.
  let combined = context + constraints + userInput;
  // Break the combined text into tokens: one pass over the full text.
  let tokens = tokenize(combined);
  // Generate a response from the token sequence.
  let response = generateResponse(tokens);
  return response;
}
```
This code combines context, constraints, and user input, then processes them to generate a response.
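To make the idea concrete, here is a minimal runnable sketch. The `tokenize` and `generateResponse` functions are hypothetical stand-ins (a real AI tokenizer and model are far more complex), but the stubs preserve the shape of the work: each does a single pass over its input.

```javascript
// Hypothetical tokenizer: split on whitespace, one pass over the text (O(n)).
function tokenize(text) {
  return text.split(/\s+/).filter(t => t.length > 0);
}

// Hypothetical response step: touches each token once (O(n)).
function generateResponse(tokens) {
  return `Processed ${tokens.length} tokens`;
}

function processInput(context, constraints, userInput) {
  const combined = context + " " + constraints + " " + userInput;
  const tokens = tokenize(combined);
  return generateResponse(tokens);
}

console.log(processInput("You are a tutor.", "Answer briefly.", "What is O(n)?"));
```

Longer context or constraints mean a longer `combined` string, so both passes do proportionally more work.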
Look for repeated steps that depend on input size.
- Primary operation: Tokenizing the combined input text.
- How many times: Once per combined input, which grows as context and constraints grow.
As the total text length increases, the number of tokens to process grows roughly in proportion.
| Input Size (tokens) | Approx. Operations |
|---|---|
| 10 | 10 token processing steps |
| 100 | 100 token processing steps |
| 1000 | 1000 token processing steps |
Pattern observation: The work grows directly with the total input size.
Time Complexity: O(n)
This means the AI's processing time grows in a straight line as the input gets longer.
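The linear pattern in the table can be checked directly. The sketch below uses a hypothetical `countSteps` function that tallies one unit of work per token, then runs it at the three input sizes from the table:

```javascript
// Count one unit of work per token; a stand-in for per-token processing.
function countSteps(tokens) {
  let steps = 0;
  for (const t of tokens) steps += 1; // one operation per token
  return steps;
}

// Reproduce the table: 10, 100, and 1000 tokens.
for (const n of [10, 100, 1000]) {
  const tokens = Array.from({ length: n }, (_, i) => `tok${i}`);
  console.log(`${n} tokens -> ${countSteps(tokens)} steps`);
}
```

Doubling the input doubles the steps, which is exactly what O(n) predicts.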
[X] Wrong: "Adding more context or constraints won't affect processing time much."
[OK] Correct: More context means more text to analyze, so the AI must do more work, increasing processing time.
Understanding how input size affects AI processing helps you explain performance in real projects and shows you think about efficiency clearly.
"What if the AI cached parts of the context so it didn't reprocess them every time? How would that change the time complexity?"