# Benefits and challenges of multithreading in Operating Systems - Time & Space Complexity
When using multithreading, we want to understand how the time to complete tasks changes as we add more threads.
We ask: Does adding threads always make things faster, and what costs come with it?
Analyze the time complexity of this simple multithreaded task execution:

```python
from threading import Thread
threads = [Thread(target=task) for _ in range(num_threads)]
for t in threads:
    t.start()   # start thread to run task
for t in threads:
    t.join()    # wait for all threads to finish
```

This code runs the same task in multiple threads concurrently and waits for all of them to complete.
Look at what repeats and what costs time.
- Primary operation: Running the task in each thread.
- How many times: Once per thread, all running at the same time.
As we add more threads, the total work is split, but overhead also grows.
| Number of Threads (n) | Approx. Time |
|---|---|
| 1 | Full task time |
| 10 | About 1/10 task time + overhead |
| 100 | Much less task time, but overhead and contention delays may dominate |
Pattern observation: More threads can reduce task time, but overhead and resource limits erode the gains.
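The pattern in the table can be observed directly. This is a minimal sketch (all names are hypothetical) that uses `time.sleep` to stand in for an I/O-bound task and times the same batch of tasks run sequentially versus split across threads:

```python
import threading
import time

def task():
    time.sleep(0.1)  # simulated I/O-bound work

def run_sequential(n):
    # run n tasks one after another
    start = time.perf_counter()
    for _ in range(n):
        task()
    return time.perf_counter() - start

def run_threaded(n):
    # run the same n tasks in parallel threads
    start = time.perf_counter()
    threads = [threading.Thread(target=task) for _ in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

# 8 tasks: roughly 0.8 s sequentially, but only about one task time
# plus thread-management overhead when threaded
```

Because the "work" here is sleeping, the threads genuinely overlap; a CPU-bound task would behave quite differently, as discussed below.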
Time Complexity: O(task_time / n + overhead)
This means total time shrinks roughly in proportion to the number of threads, but the extra costs prevent a perfect n-fold speedup.
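Plugging hypothetical numbers into that formula shows why speedup flattens and can even reverse. Here `task_time` and `overhead_per_thread` are made-up constants for illustration, with overhead modeled as growing linearly in the number of threads:

```python
def predicted_time(n, task_time=1.0, overhead_per_thread=0.002):
    # total time = parallel share of the work + cost of managing n threads
    return task_time / n + overhead_per_thread * n

# Speedup improves, flattens, then reverses as overhead dominates:
# predicted_time(1)   -> 1.002
# predicted_time(10)  -> 0.12
# predicted_time(100) -> 0.21   (worse than 10 threads)
```

With these constants, the best thread count sits between 10 and 100; past that point, the overhead term outweighs the shrinking task share.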
[X] Wrong: "Adding more threads always makes the program run faster."
[OK] Correct: More threads add scheduling and synchronization overhead, and can even cause slowdowns when resources like CPU cores or memory are limited.
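One concrete case of a limited resource: in CPython specifically, the global interpreter lock (GIL) prevents threads from executing Python bytecode on multiple cores at once, so a CPU-bound task gains nothing from extra threads. A rough sketch (hypothetical helper names):

```python
import threading
import time

def burn(iterations):
    # pure-Python CPU-bound loop; holds the GIL while it runs
    total = 0
    for i in range(iterations):
        total += i * i
    return total

def sequential(work, parts):
    start = time.perf_counter()
    for _ in range(parts):
        burn(work)
    return time.perf_counter() - start

def threaded(work, parts):
    start = time.perf_counter()
    threads = [threading.Thread(target=burn, args=(work,)) for _ in range(parts)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

# Under CPython's GIL the threaded version is not faster than the
# sequential one, and lock contention can make it slightly slower.
```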
Understanding how multithreading affects time helps you explain real-world program behavior and shows you grasp practical system limits.
What if the tasks in each thread depend on each other? How would that affect the time complexity?
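One way to explore that question: if every task must wait for the previous one to finish, the threads serialize and total time returns to roughly the full sequential task time, regardless of n. This sketch (hypothetical names) chains tasks with `threading.Event`:

```python
import threading
import time

def dependent_tasks(n, task_time=0.05):
    done = [threading.Event() for _ in range(n)]

    def worker(i):
        if i > 0:
            done[i - 1].wait()   # cannot start until predecessor finishes
        time.sleep(task_time)    # simulated work
        done[i].set()

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(n)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

# With a full dependency chain, n threads take about n * task_time,
# the same as running sequentially: parallelism buys nothing here.
```

In complexity terms, a full chain of dependencies turns O(task_time / n + overhead) back into O(task_time) plus overhead, so the dependency structure, not the thread count, sets the lower bound.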