Node.js Concept · Beginner · 3 min read

Readable Stream in Node.js: What It Is and How It Works

A readable stream in Node.js is an object that lets you read data piece by piece instead of all at once. It helps handle large data efficiently by emitting chunks of data that you can process as they arrive.
⚙️ How It Works

Think of a readable stream like a water faucet that drips water slowly instead of pouring it all at once. Instead of loading a whole file or data source into memory, Node.js reads it in small chunks. This way, your program can start working on the data immediately without waiting for everything to load.
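To make the "small chunks" idea concrete, here is a minimal sketch that writes a 12-byte file and reads it back four bytes at a time. The filename and the deliberately tiny highWaterMark (the chunk size, in bytes) are just for illustration:

```javascript
import { writeFileSync, createReadStream } from 'node:fs';

// Write a 12-byte file so the sketch runs standalone.
writeFileSync('drip.txt', 'abcdefghijkl');

// highWaterMark controls how many bytes are read per chunk.
const stream = createReadStream('drip.txt', {
  highWaterMark: 4,
  encoding: 'utf8',
});

// Logs 4-character chunks for this ASCII data.
stream.on('data', (chunk) => {
  console.log('chunk:', chunk);
});
```

In real code you would leave highWaterMark at its default (64 KiB for file streams); shrinking it here just makes the dripping visible.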

Under the hood, a readable stream emits events: data when a chunk is ready, end when all data has been read, and error if something goes wrong. You can listen to these events to process data as it flows in, making your app faster and more memory-friendly.
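The event flow can be sketched without touching the filesystem at all, using Readable.from to build a stream from an in-memory array (the chunk contents here are illustrative):

```javascript
import { Readable } from 'node:stream';

// Build a readable stream from an in-memory array.
const stream = Readable.from(['chunk one', 'chunk two', 'chunk three']);

// 'data' fires once per chunk as it becomes available.
stream.on('data', (chunk) => {
  console.log('data event:', chunk);
});

// 'end' fires after the last chunk has been emitted.
stream.on('end', () => {
  console.log('end event: stream finished');
});
```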

💻 Example

This example shows how to read a file using a readable stream and print its content chunk by chunk.

javascript
import { createReadStream } from 'node:fs';

const stream = createReadStream('example.txt', { encoding: 'utf8' });

stream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

stream.on('end', () => {
  console.log('No more data.');
});

stream.on('error', (err) => {
  console.error('Error:', err.message);
});
Output

Received chunk: (first part of file content)
Received chunk: (next part of file content)
...
No more data.
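Because readable streams are also async iterables, the same read can be written with for await...of, which delivers chunks one at a time and surfaces errors as ordinary exceptions. This sketch creates its own small example.txt so it runs standalone:

```javascript
import { writeFileSync, createReadStream } from 'node:fs';

// Create the file so the sketch runs standalone.
writeFileSync('example.txt', 'hello from a stream');

const stream = createReadStream('example.txt', { encoding: 'utf8' });

try {
  // Each iteration yields the next chunk; stream errors throw here.
  for await (const chunk of stream) {
    console.log('Received chunk:', chunk);
  }
  console.log('No more data.');
} catch (err) {
  console.error('Error:', err.message);
}
```

The try/catch replaces the separate error listener from the event-based version.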
🎯 When to Use

Use readable streams when working with large files, network responses, or any data source that can be read bit by bit. This helps avoid loading huge data all at once, which can slow down or crash your app.

Common cases include reading big files, processing data from APIs, or handling real-time data like video or audio streams.
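For the big-file case, pairing a readable stream with a writable one through stream.pipeline keeps memory use flat no matter how large the file is, because pipeline handles backpressure and cleanup for you. A sketch with illustrative filenames:

```javascript
import {
  writeFileSync,
  readFileSync,
  createReadStream,
  createWriteStream,
} from 'node:fs';
import { pipeline } from 'node:stream/promises';

// Create a source file so the sketch runs standalone.
writeFileSync('input.txt', 'some large payload');

// pipeline() connects the readable to the writable, pausing the
// reader whenever the writer falls behind (backpressure).
await pipeline(
  createReadStream('input.txt'),
  createWriteStream('copy.txt')
);

console.log('Copied:', readFileSync('copy.txt', 'utf8'));
```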

Key Points

  • Readable streams let you read data in small chunks.
  • They emit data, end, and error events.
  • They help manage memory efficiently for large data.
  • Commonly used for files, network, and real-time data.

Key Takeaways

Readable streams allow efficient, chunk-by-chunk data reading in Node.js.
They emit events to handle data flow and errors smoothly.
Use them to process large files or continuous data without high memory use.
Listening to stream events lets your app start working before all data loads.