Thread Safety Concepts in Ruby - Time Complexity
When working with threads, it is important to understand how operations behave when multiple threads run at the same time.
In particular, we want to know how the time to complete a task changes when threads share data or resources.
Analyze the time complexity of this Ruby code using threads and a shared counter.
```ruby
count = 0
mutex = Mutex.new
threads = []

10.times do
  threads << Thread.new do
    1000.times do
      # Only one thread at a time may execute this block.
      mutex.synchronize do
        count += 1
      end
    end
  end
end

# Wait for every thread to finish before reading the final count.
threads.each(&:join)
```
This code creates 10 threads, each adding 1 to a shared counter 1000 times, using a mutex to avoid conflicts.
To analyze the running time, look at the loops and the synchronization points where work repeats.
- Primary operation: Incrementing the shared counter inside a mutex lock.
- How many times: 10 threads x 1000 increments = 10,000 times total.
As the number of threads or increments grows, the total operations increase proportionally.
| Input Size (threads x increments) | Approx. Operations |
|---|---|
| 10 x 1000 = 10,000 | 10,000 increments with locking |
| 20 x 1000 = 20,000 | 20,000 increments with locking |
| 10 x 2000 = 20,000 | 20,000 increments with locking |
Pattern observation: The total work grows directly with the number of increments and threads combined.
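The table rows are just the product of the two inputs; a quick sketch (sizes taken from the table):

```ruby
# Total synchronized increments = threads * increments per thread.
sizes = [[10, 1000], [20, 1000], [10, 2000]]
totals = sizes.map { |threads, increments| threads * increments }
# => [10000, 20000, 20000], matching the table's operation counts
```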
Time Complexity: O(n), where n is the total number of increments performed across all threads.
The time to finish grows linearly with n. Because the mutex serializes every increment, adding more threads does not reduce this total work.
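You can observe the linear growth empirically by timing two different workloads. This is a sketch: the helper `timed_count` and the chosen sizes are illustrative, and exact timings will vary by machine.

```ruby
# Hypothetical helper: runs `thread_count` threads, each doing `n` locked
# increments, and returns [final_count, elapsed_seconds].
def timed_count(thread_count, n)
  count = 0
  mutex = Mutex.new
  start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  threads = Array.new(thread_count) do
    Thread.new do
      n.times { mutex.synchronize { count += 1 } }
    end
  end
  threads.each(&:join)
  elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - start
  [count, elapsed]
end

small, t_small = timed_count(10, 1_000)  # 10,000 total increments
large, t_large = timed_count(10, 2_000)  # 20,000 total increments

# With the mutex, no increments are lost:
puts small  # 10000
puts large  # 20000
# Doubling the increments roughly doubles the time, reflecting O(n) growth.
puts format("%.4fs vs %.4fs", t_small, t_large)
```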
[X] Wrong: "Using threads always makes the program run faster without extra cost."
[OK] Correct: Synchronizing shared data with locks adds waiting time, so more threads can sometimes slow things down.
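One way to reduce that waiting time is coarser locking. The sketch below, an illustrative variant of the code above, takes the lock once per thread instead of once per increment; whether this is acceptable depends on whether other threads need to read the counter mid-run.

```ruby
count = 0
mutex = Mutex.new

# Coarse-grained locking: each thread does all of its increments inside a
# single critical section, so the program pays for 10 lock acquisitions
# instead of 10,000.
threads = Array.new(10) do
  Thread.new do
    mutex.synchronize do
      1_000.times { count += 1 }
    end
  end
end
threads.each(&:join)

puts count  # 10000
```

The result is still correct; only the number of lock acquisitions changes.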
Understanding how thread safety affects time helps you write programs that work well when many things happen at once.
"What if we removed the mutex lock? How would the time complexity and program behavior change?"