Node.js · Concept · Beginner · 3 min read

When to Use Streams in Node.js: Practical Guide and Examples

Use streams in Node.js when you need to process large amounts of data efficiently without loading it all into memory at once. Streams let you read or write data piece by piece, which is ideal for files, network communication, or any continuous data flow.

⚙️ How It Works

Streams in Node.js work like a water pipe that carries data in small chunks instead of all at once. Imagine filling a bucket from a slowly running tap instead of dumping a whole tank of water in one go: you can start using the water immediately rather than waiting for the entire tank to arrive.

There are four main types of streams: readable (you can read data from them), writable (you can write data to them), duplex (both readable and writable), and transform (which can modify data while passing it through). Streams handle data piece by piece, which helps save memory and improves performance, especially with large files or continuous data like video or audio.
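As a sketch of the transform type, the following creates a stream that upper-cases whatever passes through it (the `upperCase` name and the sample input are illustrative):

```js
import { Transform } from 'node:stream';

// A transform stream that upper-cases each chunk as it passes through.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

upperCase.on('data', (chunk) => {
  console.log(chunk.toString()); // prints "HELLO " then "STREAMS"
});

upperCase.write('hello ');
upperCase.end('streams');
```

Because a transform is both writable and readable, it can sit in the middle of a pipeline and modify data on the fly.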

💻 Example

This example shows how to read a large text file using a readable stream and print its content chunk by chunk.

```js
import { createReadStream } from 'node:fs';

// Open the file as a readable stream; with an encoding set,
// each chunk arrives as a UTF-8 string instead of a Buffer.
const stream = createReadStream('largefile.txt', { encoding: 'utf8' });

stream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

stream.on('end', () => {
  console.log('Finished reading file.');
});

stream.on('error', (err) => {
  console.error('Error:', err.message);
});
```
Output

```text
Received chunk: (first part of file content)
Received chunk: (next part of file content)
...
Finished reading file.
```

🎯 When to Use

Use streams when working with large files or data sources that don't fit into memory easily. For example, streaming a video file to a user, processing large CSV files line by line, or handling real-time data from a network socket.

Streams are also great when you want to start processing data immediately without waiting for the entire input, such as parsing logs as they arrive or piping data between different parts of your application efficiently.
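Line-by-line processing can be sketched with `node:readline` layered over a read stream; each line is handled as soon as it arrives, before the file has been fully read (the `data.csv` name and contents are illustrative):

```js
import { createReadStream, writeFileSync } from 'node:fs';
import { createInterface } from 'node:readline';

// Illustrative sample CSV; in practice this would be a large existing file.
writeFileSync('data.csv', 'id,name\n1,Ada\n2,Linus\n');

const rl = createInterface({
  input: createReadStream('data.csv'),
  crlfDelay: Infinity, // treat \r\n as a single line break
});

let rows = 0;
rl.on('line', (line) => {
  rows += 1; // process each line as soon as it arrives
  console.log('Line:', line);
});

rl.on('close', () => {
  console.log(`Processed ${rows} lines.`);
});
```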

Key Takeaways

  • Use streams to handle large or continuous data efficiently without loading it all into memory.
  • Streams let you start processing data immediately as it arrives in chunks.
  • They are ideal for files, network communication, and real-time data processing.
  • Node.js streams come in readable, writable, duplex, and transform types.
  • Using streams improves app performance and reduces memory usage.