In serverless environments like Next.js API routes, why is connection pooling important?
Think about how serverless functions start and stop frequently and how databases limit connections.
Each serverless invocation can open its own database connection, so under load many connections open at once and the database's connection limit is quickly reached. Connection pooling reuses existing connections across requests, which cuts connection-setup overhead and keeps the total connection count bounded.
Consider a Next.js API route that opens a new database connection on every request without pooling. What is the most likely outcome under heavy traffic?
Think about how many connections a database can handle and what happens if too many open at once.
Without pooling, each request opens a brand-new connection. Under heavy load this exhausts the database's connection limit, producing "too many clients" errors and failed requests.
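A minimal sketch of this failure mode, with no real database involved: a fake server with a hypothetical hard limit of 5 connections, hit by handlers that each open (and hold) their own connection. The names (`MAX_CONNECTIONS`, `handleWithoutPooling`) are illustrative, not from any library.

```javascript
// Fake "database server" with a hard connection limit (hypothetical value).
const MAX_CONNECTIONS = 5;
let open = 0;

function connect() {
  if (open >= MAX_CONNECTIONS) {
    // This mirrors Postgres's "FATAL: sorry, too many clients already".
    throw new Error('too many clients already');
  }
  open++;
  return { release: () => { open--; } };
}

// Without pooling: every request opens its own connection and holds it
// for the duration of its query, so concurrent requests pile up.
function handleWithoutPooling() {
  return connect(); // may throw under load; released only when the query ends
}

const results = [];
for (let i = 0; i < 10; i++) {
  try {
    handleWithoutPooling();
    results.push('ok');
  } catch (e) {
    results.push('error');
  }
}
console.log(results.join(','));
// The first 5 concurrent requests connect; the remaining 5 are rejected.
```

A pooled version would cap `open` at the pool size and queue the excess requests instead of failing them.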
Which code snippet correctly implements connection pooling using a global variable to reuse a single database client instance in Next.js API routes?
import { Client } from 'pg';

let client;

export default async function handler(req, res) {
  if (!client) {
    client = new Client({ connectionString: process.env.DATABASE_URL });
    await client.connect();
  }
  const result = await client.query('SELECT NOW()');
  res.status(200).json({ time: result.rows[0].now });
}
Look for where the client is created and if it is reused across requests.
Storing the client in a module-level (global) variable and connecting only on the first request lets warm invocations on the same serverless instance reuse that connection instead of opening a new one. Strictly speaking this is single-connection reuse rather than a multi-connection pool, but it achieves the same goal of bounding connections per instance.
Given this code, why might the app throw a 'too many clients' error under load?
import { Pool } from 'pg';

export default async function handler(req, res) {
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });
  const client = await pool.connect();
  const result = await client.query('SELECT NOW()');
  client.release();
  res.status(200).json({ time: result.rows[0].now });
}
Check where the Pool is created and how often.
Creating a new Pool inside the handler defeats pooling: every request builds its own pool with its own connections, the pools are never shared or shut down, and their connections accumulate until the database's limit is exhausted.
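A sketch of the fix: move the Pool to module scope so it is created once per instance and shared by every invocation. To keep this runnable without a database, a stub `Pool` class stands in for pg's; with the real library only the import would change, and `poolsCreated` exists only to make the point observable.

```javascript
// Stub standing in for pg's Pool so the pattern runs without a database.
let poolsCreated = 0;
class Pool {
  constructor(config) { poolsCreated++; }
  async connect() {
    return {
      query: async () => ({ rows: [{ now: new Date().toISOString() }] }),
      release: () => {},
    };
  }
}

// The fix: create the Pool ONCE at module scope, not inside the handler.
// Module scope survives across invocations on a warm serverless instance,
// so every request draws connections from the same pool.
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function handler(req, res) {
  const client = await pool.connect();
  const result = await client.query('SELECT NOW()');
  client.release(); // return the connection to the pool; do not close it
  res.json({ time: result.rows[0].now });
}

// Simulate several requests hitting the same warm instance.
async function main() {
  const res = { json: () => {} };
  for (let i = 0; i < 3; i++) await handler({}, res);
  console.log(poolsCreated); // 1 — a single pool serves all requests
}
main();
```

Contrast with the snippet above: there, three requests would have created three pools.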
Consider this Next.js API route code. What will be the output when 3 requests arrive almost simultaneously?
import { Pool } from 'pg';

const pool = new Pool({ connectionString: process.env.DATABASE_URL, max: 2 });

export default async function handler(req, res) {
  const client = await pool.connect();
  const result = await client.query('SELECT pg_sleep(1); SELECT NOW()');
  client.release();
  res.status(200).json({ time: result[1].rows[0].now });
}
Think about the pool max connections and how pg_sleep delays queries.
The pool caps connections at 2, so two requests check out connections and each sleeps for 1 second. The third request waits in the pool's queue until a connection is released, so it responds after roughly 2 seconds while the first two respond after roughly 1 second. All three succeed; the pool converts overload into queueing delay rather than errors.
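The queueing behavior above can be sketched without a database by modeling the pool as a counting semaphore. This is a scaled-down simulation (100 ms stands in for pg_sleep(1)); `makePool` is an illustrative helper, not pg's API.

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Minimal counting semaphore modeling a pool with `max` connections:
// acquire() resolves immediately if a slot is free, otherwise it queues.
function makePool(max) {
  let free = max;
  const waiters = [];
  return {
    acquire() {
      if (free > 0) { free--; return Promise.resolve(); }
      return new Promise((resolve) => waiters.push(resolve));
    },
    release() {
      const next = waiters.shift();
      if (next) next(); // hand the slot directly to the next waiter
      else free++;
    },
  };
}

async function main() {
  const pool = makePool(2); // mirrors max: 2 in the snippet above
  const start = Date.now();
  const finished = [];
  await Promise.all(
    [1, 2, 3].map(async (id) => {
      await pool.acquire(); // request 3 blocks here until 1 or 2 releases
      await sleep(100);     // the "query" (scaled-down pg_sleep)
      pool.release();
      finished.push({ id, ms: Date.now() - start });
    })
  );
  return finished;
}

main().then((finished) => console.log(finished));
// Requests 1 and 2 finish after ~100 ms; request 3 after ~200 ms.
```

The same shape scales back up to the quiz scenario: with 1-second queries, the third request completes after about 2 seconds.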