Streams let you handle large amounts of data efficiently without loading everything into memory at once: you process the data piece by piece as it arrives.
Why streams are needed in Node.js
Introduction
Reading a large file without loading it fully into memory
Sending data over the internet in small chunks
Processing live data like video or audio streams
Handling user uploads that can be very big
Transforming data on the fly while reading or writing
Syntax
Node.js
const stream = require('stream');

// Create a readable stream
const readable = new stream.Readable({
  read() {} // no-op; data is pushed manually with readable.push()
});

// Create a writable stream
const writable = new stream.Writable({
  write(chunk, encoding, callback) {
    // process chunk
    callback();
  }
});
Streams come in four types: readable, writable, duplex (both readable and writable), and transform (which modifies data as it passes through).
Streams use events like 'data', 'end', and 'error' to handle data flow.
Examples
This reads a file piece by piece and logs the size of each chunk.
Node.js
const fs = require('fs');

// Read file as stream
const readable = fs.createReadStream('bigfile.txt');

readable.on('data', chunk => {
  console.log('Received chunk:', chunk.length);
});
This writes data to a file in parts without loading all data at once.
Node.js
const fs = require('fs');

// Write data to file as stream
const writable = fs.createWriteStream('output.txt');

writable.write('Hello ');
writable.write('World!');
writable.end();
Sample Program
This program reads a file in chunks and counts how many bytes it read in total. It shows how streams let you handle big files efficiently.
Node.js
const fs = require('fs');

// Stream to read a large file and count total bytes
let totalBytes = 0;
const readable = fs.createReadStream('sample.txt');

readable.on('data', chunk => {
  totalBytes += chunk.length;
});

readable.on('end', () => {
  console.log(`Total bytes read: ${totalBytes}`);
});

readable.on('error', err => {
  console.error('Error reading file:', err.message);
});
Important Notes
Streams help keep your app fast and use less memory.
Always handle errors on streams to avoid crashes.
Streams can be piped together to connect reading and writing easily.
Summary
Streams let you work with data bit by bit, not all at once.
They are useful for big files, live data, and network communication.
Using streams saves memory and improves performance.