
What Are Streams in Node.js: Explanation and Examples

Streams in Node.js are objects that let you read or write data piece by piece instead of all at once. They help handle large data efficiently by processing it in chunks, like reading a book page by page rather than the whole book at once.
⚙️ How It Works

Imagine you want to drink a large bottle of water. Instead of pouring it all at once and risking spilling, you take small sips. Streams in Node.js work similarly by handling data in small parts called chunks. This way, your program can start working on the data immediately without waiting for everything to load.

Streams come in four types: readable (to get data), writable (to send data), duplex (both read and write), and transform (modify data while passing it through). They emit events (such as data, end, and error) to notify your program when new data is ready or when the process finishes, making data handling efficient and memory-friendly.
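The duplex and transform types can feel abstract, so here is a minimal sketch of a transform stream that uppercases each chunk as it passes through (the stream name and sample strings are just for illustration):

```javascript
import { Transform } from 'node:stream';

// A transform stream: data written in comes out modified on the readable side.
const upper = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

const chunks = [];
upper.on('data', (chunk) => chunks.push(chunk.toString()));
upper.on('end', () => console.log(chunks.join(''))); // HELLO STREAMS

upper.write('hello ');
upper.end('streams');
```

Because a transform stream is also a duplex stream, you write to one end and read the modified data from the other.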

💻 Example

This example shows how to read a text file using a readable stream and print its content chunk by chunk.

```javascript
import { createReadStream } from 'node:fs';

// Read the file in chunks; the 'utf8' encoding makes each chunk a string.
const stream = createReadStream('example.txt', { encoding: 'utf8' });

stream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});

stream.on('end', () => {
  console.log('No more data.');
});

stream.on('error', (err) => {
  console.error('Error:', err.message);
});
```
Output

```
Received chunk: This is the first part of the file.
Received chunk: This is the second part of the file.
No more data.
```
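Readable and writable streams are often connected together. As a sketch of that idea, the following copies a file chunk by chunk using pipeline() from node:stream/promises; the sample file is created first so the snippet is self-contained, and both filenames are made up for the demo:

```javascript
import { createReadStream, createWriteStream } from 'node:fs';
import { writeFile, readFile } from 'node:fs/promises';
import { pipeline } from 'node:stream/promises';

// Create a small sample file to copy (illustrative filename).
await writeFile('example.txt', 'This is the file content.');

// pipeline() connects the readable stream to the writable stream,
// moving the data chunk by chunk and handling errors and cleanup.
await pipeline(
  createReadStream('example.txt'),
  createWriteStream('copy.txt'),
);

const copy = await readFile('copy.txt', 'utf8');
console.log('Copied:', copy);
```

Because pipeline() never loads the whole file into memory, the same code works for files of any size.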
🎯 When to Use

Use streams when working with large files or data that you don't want to load entirely into memory, such as video files, logs, or network data. They are perfect for real-time data processing, like reading user uploads or sending data over the internet in pieces.

Streams improve performance and reduce memory use, making your Node.js applications faster and more scalable.
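As a counterpart to the readable example above, here is a small sketch of a writable stream that appends log lines one at a time instead of building one large string in memory (app.log is a hypothetical filename):

```javascript
import { createWriteStream } from 'node:fs';

// Write data piece by piece; each write() sends one chunk to the file.
const out = createWriteStream('app.log');

out.write('first line\n');
out.write('second line\n');
out.end('last line\n'); // end() writes a final chunk and closes the stream

// 'finish' fires once all data has been flushed to the destination.
out.on('finish', () => console.log('All data flushed.'));
```

The same pattern applies to any writable destination, such as an HTTP response or a network socket.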

Key Points

  • Streams process data in small chunks, not all at once.
  • They help handle large data efficiently without using much memory.
  • Four types: readable, writable, duplex, and transform.
  • Use events like data, end, and error to manage streams.
  • Ideal for files, network communication, and real-time data.

Key Takeaways

  • Streams let Node.js handle data piece by piece for better memory use.
  • They are essential for working with large files or continuous data.
  • Use readable streams to get data and writable streams to send data.
  • Streams emit events to signal data availability and completion.
  • They make applications faster and more efficient with big data.