Go Concept · Beginner · 3 min read

Worker Pool Pattern in Go: What It Is and How It Works

The worker pool pattern in Go is a way to manage multiple goroutines (workers) that process tasks from a shared job queue concurrently. It helps control the number of active workers to efficiently use resources and avoid overload.
⚙️

How It Works

Imagine you have many tasks to do, like packing boxes in a warehouse. Instead of one person doing all the work, you have a team of workers. Each worker picks a box from a shared pile and packs it. When done, they pick the next box until all are packed.

In Go, the worker pool pattern works the same way. You create a fixed number of goroutines (workers) that wait for tasks on a shared channel (job queue). Each worker takes a task, processes it, and then waits for the next one. This setup balances the workload and prevents creating too many goroutines that could slow down the program.

💻

Example

This example shows a worker pool with 3 workers processing 5 jobs. Each job prints its number and the worker handling it.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

func worker(id int, jobs <-chan int, wg *sync.WaitGroup) {
	defer wg.Done()
	for j := range jobs {
		fmt.Printf("Worker %d started job %d\n", id, j)
		time.Sleep(time.Second) // simulate work
		fmt.Printf("Worker %d finished job %d\n", id, j)
	}
}

func main() {
	jobs := make(chan int, 5)
	var wg sync.WaitGroup

	// Start 3 workers
	for w := 1; w <= 3; w++ {
		wg.Add(1)
		go worker(w, jobs, &wg)
	}

	// Send 5 jobs
	for j := 1; j <= 5; j++ {
		jobs <- j
	}
	close(jobs) // no more jobs

	wg.Wait() // wait for all workers to finish
}
```

Output (one possible run; the exact interleaving varies, since the Go scheduler decides which goroutine runs when):

```
Worker 1 started job 1
Worker 2 started job 2
Worker 3 started job 3
Worker 1 finished job 1
Worker 1 started job 4
Worker 2 finished job 2
Worker 2 started job 5
Worker 3 finished job 3
Worker 1 finished job 4
Worker 2 finished job 5
```
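In practice, workers usually produce results as well as consume jobs. One common extension is a second channel that workers send their results on; the sketch below wraps the pattern in a hypothetical helper, `runPool` (a name introduced here for illustration, not from the example above), where each worker squares its job and the caller sums the results:

```go
package main

import (
	"fmt"
	"sync"
)

// runPool starts nWorkers goroutines that square each of nJobs jobs,
// collects the results on a second channel, and returns their sum.
func runPool(nWorkers, nJobs int) int {
	jobs := make(chan int, nJobs)
	results := make(chan int, nJobs)
	var wg sync.WaitGroup

	for w := 0; w < nWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs {
				results <- j * j
			}
		}()
	}

	for j := 1; j <= nJobs; j++ {
		jobs <- j
	}
	close(jobs)

	wg.Wait()      // all workers have returned, so no more sends on results
	close(results) // safe to close now; the range below can terminate

	sum := 0
	for r := range results {
		sum += r
	}
	return sum
}

func main() {
	fmt.Println("sum of squares:", runPool(3, 5)) // 1+4+9+16+25 = 55
}
```

Note the ordering: `results` must be closed only after `wg.Wait()` confirms every worker has stopped sending, otherwise a worker could send on a closed channel and panic. Buffering `results` to `nJobs` lets workers finish without waiting for the reader.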
🎯

When to Use

Use the worker pool pattern when you have many tasks to process concurrently but want to limit how many run at the same time. This helps avoid using too much memory or CPU.

Common cases include handling web requests, processing files, or running background jobs where tasks can be done independently but need controlled concurrency.

Key Points

  • A worker pool limits the number of active goroutines to control resource use.
  • Workers receive tasks from a shared channel and process them independently.
  • It improves efficiency and prevents overload in concurrent programs.
  • Use synchronization like sync.WaitGroup to wait for all workers to finish.

Key Takeaways

The worker pool pattern manages concurrency by using a fixed number of goroutines to process tasks from a shared queue.
It helps control resource usage and improves program efficiency by limiting active workers.
Use channels to send tasks and sync.WaitGroup to wait for all workers to complete.
Ideal for workloads with many independent tasks needing controlled parallel processing.