
Why compression saves storage and bandwidth in Linux CLI - Performance Analysis

Time Complexity: Why compression saves storage and bandwidth
Understanding Time Complexity

We want to understand how compressing files affects the time it takes to save and send data.

How does the work grow when the file size grows?

Scenario Under Consideration

Analyze the time complexity of compressing a file using gzip.

gzip largefile.txt
# Compresses largefile.txt into largefile.txt.gz
# It reads the whole file, compresses it, writes the output,
# and removes the original (pass -k to keep it)

This code compresses a file to save space and reduce data sent over a network.
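A quick way to see the space savings is to compress a throwaway file and compare sizes. The sketch below uses a hypothetical scratch file named sample.txt; repetitive text is used because it compresses well:

```shell
# Create a sample file with repetitive text (compresses well)
seq 1 50000 > sample.txt

# Compress, keeping the original for comparison (-k, gzip 1.6+)
gzip -kf sample.txt

# Compare sizes: the .gz file should be much smaller
ls -l sample.txt sample.txt.gz

# Clean up
rm -f sample.txt sample.txt.gz
```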

Identify Repeating Operations

Look at what repeats as the file size grows.

  • Primary operation: Reading and processing each byte of the file to compress it.
  • How many times: Once for every byte in the file (the whole file is processed).
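Since the primary operation runs once per byte, the operation count is simply the file's size in bytes, which `wc -c` reports (tiny.txt is an illustrative filename):

```shell
# The number of bytes gzip must process equals the file's size in bytes
printf 'hello world\n' > tiny.txt
wc -c tiny.txt   # reports 12 bytes, i.e. 12 "read and process" steps
rm -f tiny.txt
```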
How Execution Grows With Input

As the file gets bigger, the time to compress grows roughly in direct proportion.

Input Size (n) | Approx. Operations
10 KB          | about 10,000
100 KB         | about 100,000
1 MB           | about 1,000,000

Pattern observation: Doubling the file size roughly doubles the work needed.
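You can check the doubling pattern yourself by timing gzip on two files where one is exactly twice the size of the other. The file names below are illustrative, and exact timings will vary by machine:

```shell
# Build two test files; big.bin is exactly twice the size of small.bin
head -c 10000000 /dev/urandom > small.bin   # ~10 MB of random data
cat small.bin small.bin > big.bin           # ~20 MB

# Time each compression; expect big.bin to take roughly twice as long
time gzip -kf small.bin
time gzip -kf big.bin

# Clean up
rm -f small.bin big.bin small.bin.gz big.bin.gz
```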

Final Time Complexity

Time Complexity: O(n)

This means the time to compress grows linearly with the file size.

Common Mistake

[X] Wrong: "Compression time stays the same no matter the file size."

[OK] Correct: The compressor must read and process every byte, so bigger files take more time.

Interview Connect

Understanding how compression time grows with file size helps you reason about performance in real tasks like backing up or transferring files.

Self-Check

What if we used a compression tool that only compresses parts of the file? How would the time complexity change?