Node.js · framework · ~20 mins

Piping streams together in Node.js - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Stream Piping Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
⚙️ Component Behavior
intermediate
What is the output of this Node.js stream piping code?
Consider the following code that reads from a readable stream, transforms data, and writes to a writable stream. What will be the final output written?
Node.js
import { Readable, Writable, Transform } from 'stream';

const readable = Readable.from(['a', 'b', 'c']);

const transform = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

let result = '';
const writable = new Writable({
  write(chunk, encoding, callback) {
    result += chunk.toString();
    callback();
  }
});

readable.pipe(transform).pipe(writable);

writable.on('finish', () => {
  console.log(result);
});
A. "ABC"
B. "abc"
C. "a b c"
D. "undefined"
💡 Hint
Think about how the Transform stream changes the data before it reaches the writable stream.
📝 Syntax
intermediate
Which option correctly pipes streams to handle errors?
You want to pipe a readable stream to a writable stream and handle errors properly. Which code snippet correctly attaches error handlers to both streams?
A. readable.pipe(writable).on('error', err => console.error('Writable error:', err)); readable.on('error', err => console.error('Readable error:', err));
B. readable.pipe(writable).on('error', err => console.error('Readable error:', err)); writable.on('error', err => console.error('Writable error:', err));
C. readable.pipe(writable); readable.on('error', err => console.error('Readable error:', err)); writable.on('error', err => console.error('Writable error:', err));
D. readable.pipe(writable).on('error', err => console.error('Stream error:', err));
💡 Hint
Remember that pipe returns the destination stream, and error events must be handled on both streams separately.
🔧 Debug
advanced
Why does this piping code cause a memory leak?
Examine the code below. Why might this cause a memory leak or high memory usage?
Node.js
import { Readable, Writable } from 'stream';

const readable = new Readable({
  read() {
    this.push('data');
  }
});

const writable = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(() => {
      callback();
    }, 1000);
  }
});

readable.pipe(writable);
A. The readable stream never signals end, so it keeps pushing data endlessly, causing memory to grow.
B. The readable stream pushes data synchronously, which is not allowed and causes memory issues.
C. The pipe method is missing an error handler, causing unhandled errors and memory leaks.
D. The writable stream's callback is delayed, causing backpressure to build up and memory to increase.
💡 Hint
Think about how the readable stream signals the end of data.
📊 State Output
advanced
What is the value of 'count' after this stream piping completes?
This code counts how many chunks pass through a transform stream. What is the final value of 'count' after piping completes?
Node.js
import { Readable, Writable, Transform } from 'stream';

const readable = Readable.from(['x', 'y', 'z']);

let count = 0;
const transform = new Transform({
  transform(chunk, encoding, callback) {
    count++;
    this.push(chunk);
    callback();
  }
});

const writable = new Writable({
  write(chunk, encoding, callback) {
    callback();
  }
});

readable.pipe(transform).pipe(writable);

writable.on('finish', () => {
  console.log(count);
});
A. undefined
B. 0
C. 1
D. 3
💡 Hint
Count increments once per chunk transformed.
🧠 Conceptual
expert
Which option best describes the behavior of stream piping with backpressure?
In Node.js streams, what happens when the writable stream is slower than the readable stream during piping?
A. The readable stream continues pushing data regardless, causing the writable stream to buffer indefinitely.
B. The readable stream pauses automatically to wait for the writable stream to catch up, preventing memory overflow.
C. The writable stream speeds up to match the readable stream's pace, so no data is lost.
D. The pipe method throws an error if the writable stream is slower than the readable stream.
💡 Hint
Think about how Node.js manages flow control between streams.