OAuth 2.0 and OpenID Connect in Cybersecurity - Time & Space Complexity
When working with OAuth 2.0 and OpenID Connect, it is important to understand how the time to process authentication and authorization requests grows as the number of users and tokens increases. In other words: how does the system's work scale with the number of requests or stored tokens it must handle?
Analyze the time complexity of the following simplified token validation process.
```javascript
function validateToken(token, tokenStore) {
  // Linear scan: examine each stored token until we find a matching id
  for (let storedToken of tokenStore) {
    if (storedToken.id === token.id) {
      return storedToken.isValid; // found: report whether it is still valid
    }
  }
  return false; // no match anywhere in the store: treat the token as invalid
}
```
This code checks if a given token exists and is valid by searching through a list of stored tokens.
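A minimal usage sketch may help make the behavior concrete. The token store contents below are illustrative, not part of the original code, and the function is repeated so the sketch runs on its own:

```javascript
// validateToken as defined above, repeated so this sketch is self-contained
function validateToken(token, tokenStore) {
  for (let storedToken of tokenStore) {
    if (storedToken.id === token.id) {
      return storedToken.isValid;
    }
  }
  return false;
}

// Hypothetical token store with two entries
const tokenStore = [
  { id: "a1", isValid: true },
  { id: "b2", isValid: false },
];

console.log(validateToken({ id: "a1" }, tokenStore)); // true: found and valid
console.log(validateToken({ id: "b2" }, tokenStore)); // false: found but revoked
console.log(validateToken({ id: "zz" }, tokenStore)); // false: not in the store
```

Note that the "not in the store" case is the worst case: the loop must check every stored token before giving up.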
Look for repeated steps whose count grows with the size of the input.
- Primary operation: Looping through the list of stored tokens to find a match.
- How many times: Up to once for each token in the store until a match is found or the list ends.
As the number of stored tokens increases, the time to find a token grows roughly in direct proportion.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | Up to 10 checks |
| 100 | Up to 100 checks |
| 1000 | Up to 1000 checks |
Pattern observation: The work grows steadily as the list gets longer, roughly one check per token.
Time Complexity: O(n)
This means the time to validate a token grows linearly with the number of stored tokens.
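The table above can be reproduced by instrumenting a worst-case lookup, a search for a token id that is not in the store. The `countChecks` helper below is a hypothetical sketch written for this purpose:

```javascript
// Count how many id comparisons a worst-case (missing-token) scan makes
// for a store of n tokens
function countChecks(n) {
  // Build a store of n hypothetical tokens: t0, t1, ..., t(n-1)
  const store = Array.from({ length: n }, (_, i) => ({ id: `t${i}`, isValid: true }));
  let checks = 0;
  for (let storedToken of store) {
    checks++; // one comparison per stored token
    if (storedToken.id === "missing") break; // never matches
  }
  return checks;
}

console.log(countChecks(10));   // 10
console.log(countChecks(100));  // 100
console.log(countChecks(1000)); // 1000
```

The comparison count equals the store size, which is exactly the linear O(n) growth described above.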
[X] Wrong: "Token validation time stays the same no matter how many tokens are stored."
[OK] Correct: Because the code checks tokens one by one, more tokens mean more checks, so time grows with the list size.
Understanding how token validation scales helps you explain real-world system behavior and design better authentication flows.
"What if the tokenStore was changed from a list to a hash map? How would the time complexity change?"
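One way to explore that question is a sketch using JavaScript's built-in `Map`, keyed by token id. A hash-map lookup is O(1) on average, so validation time stays roughly constant no matter how many tokens are stored. The store contents and the `validateTokenFast` name below are illustrative assumptions:

```javascript
// Hypothetical Map-based store: token id -> validity flag
const tokenMap = new Map([
  ["a1", true],
  ["b2", false],
]);

function validateTokenFast(token, store) {
  // Map.get is an average O(1) hash lookup: no scan over the whole store,
  // so adding more tokens barely changes lookup time
  return store.get(token.id) ?? false; // unknown ids default to invalid
}

console.log(validateTokenFast({ id: "a1" }, tokenMap)); // true
console.log(validateTokenFast({ id: "b2" }, tokenMap)); // false
console.log(validateTokenFast({ id: "zz" }, tokenMap)); // false
```

Under this change, the time complexity of a single validation would drop from O(n) to O(1) on average, at the cost of keeping the store indexed by id.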