Brotli compression in Nginx - Time Complexity
We want to understand how the time Brotli takes to compress data in nginx changes as the input grows: how much more work is needed when the input gets bigger?
Analyze the time complexity of the following nginx Brotli compression configuration snippet.
```nginx
brotli on;
brotli_comp_level 5;
brotli_types text/html text/css application/javascript;
brotli_min_length 100;
```
This snippet enables Brotli at compression level 5 (on a 0-11 scale) for the listed content types, and skips responses shorter than 100 bytes.
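Before any compression work happens, nginx applies two cheap gates set by this config: is the MIME type listed, and is the response long enough? The sketch below mirrors that gating logic in Python; the names (`should_compress`, the constants) are illustrative, not nginx internals.

```python
# Hypothetical sketch of the gating the snippet configures: nginx only
# hands a response to Brotli when its MIME type is listed in
# brotli_types AND its length meets brotli_min_length.

BROTLI_TYPES = {"text/html", "text/css", "application/javascript"}
BROTLI_MIN_LENGTH = 100  # bytes, from `brotli_min_length 100;`

def should_compress(content_type: str, body_length: int) -> bool:
    """Mirror the two checks the config snippet sets up."""
    return content_type in BROTLI_TYPES and body_length >= BROTLI_MIN_LENGTH

print(should_compress("text/html", 5000))   # large HTML page -> True
print(should_compress("text/html", 50))     # below min length -> False
print(should_compress("image/png", 5000))   # type not listed  -> False
```

The `brotli_min_length` check exists precisely because of the complexity we analyze below: for tiny responses, the fixed overhead of compression outweighs the bytes saved.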
Identify the operations that repeat: loops, recursion, and traversals over the input.
- Primary operation: The Brotli compressor scans the input, searching a sliding window of recently seen bytes for matches and encoding the result.
- How many times: Every byte of the input is read at least once; higher compression levels search more candidate matches per byte, which multiplies the per-byte work by a constant.
As the input size grows, the compression work grows roughly in proportion to the data size, but higher compression levels do more work per byte.
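You can observe this scaling empirically. Python's standard library has no Brotli binding, so the sketch below uses `zlib` as a stand-in; both are dictionary-based compressors whose running time grows roughly linearly with input size.

```python
import time
import zlib

def time_compress(data: bytes, level: int = 5) -> float:
    """Return the seconds taken to compress `data` at the given level."""
    start = time.perf_counter()
    zlib.compress(data, level)
    return time.perf_counter() - start

# Compressible, repetitive payloads at three sizes (10 KB, 100 KB, 1 MB).
for size in (10_000, 100_000, 1_000_000):
    payload = b"<p>hello world</p>" * (size // 18)
    print(f"{size:>9} bytes: {time_compress(payload):.6f} s")
```

Running this, the measured time grows roughly tenfold with each tenfold increase in input size, which is the signature of linear scaling.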
| Input Size (n) | Approx. Work |
|---|---|
| 10 KB | baseline (~n byte operations) |
| 100 KB | ~10x the baseline |
| 1 MB | ~100x the baseline |
Pattern observation: The work grows roughly linearly with input size, but compression level affects the constant factor.
Time Complexity: O(n)
This means the time to compress grows roughly in direct proportion to the size of the input data.
[X] Wrong: "Compression time stays the same no matter how big the file is."
[OK] Correct: Larger files require more processing, so compression time increases with file size.
Understanding how compression time scales helps you design efficient servers and troubleshoot performance issues confidently.
"What if we increase the brotli_comp_level from 5 to 9? How would the time complexity change?"