Challenge - 5 Problems
Stream Piping Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
❓ Component Behavior
intermediate · 2:00 remaining
What is the output of this Node.js stream piping code?
Consider the following code that reads from a readable stream, transforms data, and writes to a writable stream. What will be the final output written?
Node.js
import { Readable, Writable, Transform } from 'stream';

const readable = Readable.from(['a', 'b', 'c']);

const transform = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

let result = '';
const writable = new Writable({
  write(chunk, encoding, callback) {
    result += chunk.toString();
    callback();
  }
});

readable.pipe(transform).pipe(writable);

writable.on('finish', () => {
  console.log(result);
});
💡 Hint
Think about how the Transform stream changes the data before it reaches the writable stream.
Explanation
The readable stream emits 'a', 'b', 'c'. The transform stream converts each chunk to uppercase, so 'A', 'B', 'C' are pushed. The writable stream collects these chunks into the result string. So the final output is 'ABC'.
📝 Syntax
intermediate · 2:00 remaining
Which option correctly pipes streams to handle errors?
You want to pipe a readable stream to a writable stream and handle errors properly. Which code snippet correctly attaches error handlers to both streams?
💡 Hint
Remember that pipe returns the destination stream, and error events must be handled on both streams separately.
Explanation
The correct option attaches error handlers to both the readable and the writable stream separately. The other options fail in different ways: one attaches an error handler only to the value returned by pipe() (the writable), leaving readable errors unhandled; one swaps the handlers between the two streams; and one attaches a single error handler, missing errors on the readable. Because pipe() does not forward errors from source to destination, each stream needs its own handler.
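The options themselves are not reproduced above, so here is a minimal sketch of the pattern the correct option describes, with illustrative streams: error handlers attached to each stream explicitly, since pipe() returns the destination and does not forward source errors.

```javascript
import { Readable, Writable } from 'node:stream';

// Illustrative streams standing in for the quiz's options.
const readable = Readable.from(['a', 'b']);
let out = '';
const writable = new Writable({
  write(chunk, encoding, callback) {
    out += chunk.toString();
    callback();
  }
});

// pipe() returns its destination, so chaining .on('error') after
// pipe() would attach the handler only to the writable stream.
// Attach a handler to each stream explicitly instead:
readable.on('error', (err) => console.error('read error:', err));
writable.on('error', (err) => console.error('write error:', err));

readable.pipe(writable);
```

In modern Node.js, `stream.pipeline()` (or `node:stream/promises`) is often preferable, since it propagates errors and cleans up all streams with a single callback.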
🔧 Debug
advanced · 2:00 remaining
Why does this piping code cause a memory leak?
Examine the code below. Why might this cause a memory leak or high memory usage?
Node.js
import { Readable, Writable } from 'stream';

const readable = new Readable({
  read() {
    this.push('data');
  }
});

const writable = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(() => {
      callback();
    }, 1000);
  }
});

readable.pipe(writable);
💡 Hint
Think about how the readable stream signals the end of data.
Explanation
The readable stream's read() implementation pushes 'data' but never pushes null to signal end-of-stream. The stream therefore never ends: data keeps flowing indefinitely, internal buffers stay full, and 'finish' never fires. The writable stream's delayed callback adds backpressure, which slows the flow but does not by itself cause the unbounded behavior.
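A fixed sketch of the readable side, assuming we only ever wanted a bounded amount of data (the chunk count here is illustrative): once the data is exhausted, push(null) ends the stream so the pipe can finish.

```javascript
import { Readable, Writable } from 'node:stream';

let remaining = 3; // bounded number of chunks (illustrative)
const readable = new Readable({
  read() {
    if (remaining > 0) {
      remaining--;
      this.push('data');
    } else {
      this.push(null); // signal end-of-stream so 'finish' can fire
    }
  }
});

let written = 0;
const writable = new Writable({
  write(chunk, encoding, callback) {
    written++;
    // The delayed callback still applies backpressure, but the
    // stream now terminates instead of flowing forever.
    setTimeout(callback, 10);
  }
});

readable.pipe(writable);
```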
❓ State Output
advanced · 2:00 remaining
What is the value of 'count' after this stream piping completes?
This code counts how many chunks pass through a transform stream. What is the final value of 'count' after piping completes?
Node.js
import { Readable, Writable, Transform } from 'stream';

const readable = Readable.from(['x', 'y', 'z']);

let count = 0;
const transform = new Transform({
  transform(chunk, encoding, callback) {
    count++;
    this.push(chunk);
    callback();
  }
});

const writable = new Writable({
  write(chunk, encoding, callback) {
    callback();
  }
});

readable.pipe(transform).pipe(writable);

writable.on('finish', () => {
  console.log(count);
});
💡 Hint
Count increments once per chunk transformed.
Explanation
The readable stream emits 3 chunks: 'x', 'y', 'z'. The transform stream increments count for each chunk, so count becomes 3.
🧠 Conceptual
expert · 2:00 remaining
Which option best describes the behavior of stream piping with backpressure?
In Node.js streams, what happens when the writable stream is slower than the readable stream during piping?
💡 Hint
Think about how Node.js manages flow control between streams.
Explanation
Node.js streams implement backpressure: when the writable stream is slower than the readable, pipe() pauses the readable stream until the writable's buffer drains, preventing unbounded buffering and memory growth.
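A minimal sketch of the underlying mechanism (buffer sizes here are illustrative): writable.write() returns false once the internal buffer reaches highWaterMark, and this is the signal pipe() uses to pause the readable side until the 'drain' event fires.

```javascript
import { Writable } from 'node:stream';

// Tiny buffer so backpressure kicks in immediately (illustrative).
const writable = new Writable({
  highWaterMark: 4,
  write(chunk, encoding, callback) {
    // Simulate a slow consumer.
    setTimeout(callback, 10);
  }
});

// write() returns false when the internal buffer is full --
// the same signal pipe() uses to pause its readable source.
const keepWriting = writable.write('hello'); // 5 bytes > highWaterMark of 4
console.log(keepWriting); // false: caller should wait for 'drain'

writable.on('drain', () => {
  console.log('drained: safe to resume writing');
  writable.end();
});
```

When you call pipe(), this pause-and-resume loop is handled for you; the return value of write() only matters when writing to a stream manually.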