Channel synchronization in Go - Time & Space Complexity
When using channels in Go, it's important to understand how the program's running time scales as more data passes through them, and how blocking on communication contributes to that time.
How does the program's running time grow when more messages are sent and received through channels?
Analyze the time complexity of the following code snippet.
```go
package main

func main() {
    ch := make(chan int)
    go func() {
        for i := 0; i < 1000; i++ {
            ch <- i
        }
        close(ch)
    }()
    for val := range ch {
        _ = val
    }
}
```
This code sends 1000 integers through a channel from one goroutine to the main goroutine, which receives them one by one.
Identify the repeated work: there is no recursion or array traversal here, just the sender's for loop and the receiver's range loop running in lockstep.
- Primary operation: Sending and receiving each integer through the channel.
- How many times: Exactly 1000 times, once per integer.
Each message sent must be received, so the total work grows directly with the number of messages.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 sends and 10 receives |
| 100 | About 100 sends and 100 receives |
| 1000 | About 1000 sends and 1000 receives |
Pattern observation: The total number of steps grows in direct proportion to the number of messages.
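The counts in the table can be checked empirically. The sketch below instruments the same pipeline with simple counters; the `countOps` helper and its structure are additions for illustration, not part of the original snippet.

```go
package main

import "fmt"

// countOps runs the same unbuffered-channel pipeline as the snippet
// above, but counts every send and every receive.
func countOps(n int) (sends, receives int) {
    ch := make(chan int)
    done := make(chan struct{})
    go func() {
        for i := 0; i < n; i++ {
            ch <- i
            sends++
        }
        close(ch)
        close(done) // signals that all sends (and increments) are finished
    }()
    for range ch {
        receives++
    }
    <-done // make the sender's final increment visible before returning
    return sends, receives
}

func main() {
    for _, n := range []int{10, 100, 1000} {
        s, r := countOps(n)
        fmt.Printf("n=%d: %d sends, %d receives\n", n, s, r)
    }
}
```

Running this reproduces the table: exactly n sends and n receives for each input size.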
Time Complexity: O(n)
This means the time to send and receive messages grows linearly with the number of messages. Space complexity is O(1): an unbuffered channel holds at most one value in flight, and the receiver discards each value as it arrives.
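The linear growth can also be observed directly by timing the pipeline at increasing sizes. This is a rough sketch, not a rigorous benchmark (absolute numbers vary by machine; Go's `testing` benchmarks are the proper tool), and the `run` helper is an addition for illustration.

```go
package main

import (
    "fmt"
    "time"
)

// run sends n integers through an unbuffered channel and returns the
// elapsed wall-clock time.
func run(n int) time.Duration {
    start := time.Now()
    ch := make(chan int)
    go func() {
        for i := 0; i < n; i++ {
            ch <- i
        }
        close(ch)
    }()
    for range ch {
        // drain the channel; the work per message is constant
    }
    return time.Since(start)
}

func main() {
    // Expect elapsed time to grow roughly tenfold per row: O(n).
    for _, n := range []int{10_000, 100_000, 1_000_000} {
        fmt.Printf("n=%d took %v\n", n, run(n))
    }
}
```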
[X] Wrong: "Channels add a fixed delay, so time stays the same no matter how many messages are sent."
[OK] Correct: Each message requires a send and receive step, so more messages mean more total time.
Understanding how channel communication scales helps you write efficient concurrent programs and explain your reasoning clearly in interviews.
"What if we used a buffered channel with capacity 1000 instead of an unbuffered channel? How would the time complexity change?"