Challenge: 5 Problems
Stream Mastery
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
Intermediate
Why use streams instead of loading entire files?
Which of the following best explains why streams are preferred over loading an entire file into memory in Node.js?
💡 Hint
Think about memory use when files are very large.
✅ Explanation
Streams process data in chunks, so the whole file never needs to be loaded into memory at once. This saves memory, especially for large files.
❓ Component Behavior
Intermediate
What happens when reading a large file with streams?
Given this Node.js code snippet using streams, what will be the output behavior?
Node.js
const fs = require('fs');

const stream = fs.createReadStream('largefile.txt');

stream.on('data', (chunk) => {
  console.log('Received chunk of size:', chunk.length);
});

stream.on('end', () => {
  console.log('Finished reading file');
});
💡 Hint
Streams emit 'data' events multiple times as chunks arrive.
✅ Explanation
The 'data' event fires once for each chunk, so it fires repeatedly. After the last chunk has been delivered, the 'end' event fires exactly once.
📝 Syntax
Advanced
Identify the error in this stream reading code
What error will this Node.js code produce when trying to read a file using streams?
Node.js
const fs = require('fs');

const stream = fs.createReadStream('file.txt');

stream.on('data', (chunk) => {
  console.log(chunk.toString());
});

stream.on('finish', () => {
  console.log('Done reading');
});
💡 Hint
Check if 'finish' is the right event for reading streams.
✅ Explanation
The 'finish' event belongs to writable streams, so the handler above never fires. Readable streams emit 'end' when all data has been consumed.
❓ State Output
Advanced
Memory usage difference between streams and full file read
If you read a 1GB file using fs.readFile versus fs.createReadStream, what is the expected memory usage behavior?
💡 Hint
Think about how much data is held in memory at once.
✅ Explanation
fs.readFile buffers the entire 1GB file in memory at once. A stream holds only one small chunk at a time, so its memory usage stays low and roughly constant regardless of file size.
🔧 Debug
Expert
Why does this stream pipeline appear to hang without output?
Consider this Node.js code that reads one file and writes to another using streams. Why does it appear to hang, producing no output?
Node.js
const fs = require('fs');

const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

readStream.pipe(writeStream); // No event handlers attached
💡 Hint
Think about what happens if no logs or events are used.
✅ Explanation
The pipe itself works and copies the data; it only appears to hang because, with no event handlers, nothing is logged and there is no signal that the copy finished. The process exits once both streams close.