Node.js framework · ~20 mins

Stream backpressure concept in Node.js - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️ Stream Backpressure Master
Get all challenges correct to earn this badge! Test your skills under time pressure!
🧠 Conceptual · intermediate · 2:00 time limit
What is the main purpose of backpressure in Node.js streams?
Why do streams use backpressure in Node.js?
A. To speed up the data transfer between streams regardless of the writable stream's capacity
B. To close the stream automatically when data size exceeds a limit
C. To buffer all data in memory before writing it to the destination
D. To control the flow of data so the writable stream is not overwhelmed by the readable stream
💡 Hint
Think about what happens if data is sent faster than it can be processed.
Component Behavior · intermediate · 2:00 time limit
What happens when writable.write() returns false in a Node.js stream?
In Node.js streams, if writable.write(chunk) returns false, what does it mean for the data flow?
A. The writable stream has ended and cannot accept more data
B. The chunk was written successfully and the stream is ready for more data immediately
C. The internal buffer is full; the writable stream asks the readable stream to pause sending data
D. The chunk was rejected and will be lost
💡 Hint
Consider what the return value of writable.write() indicates about the buffer state.
State Output · advanced · 2:00 time limit
What is the output of this Node.js stream code regarding backpressure?
Consider this code snippet using streams in Node.js. What will be logged to the console?
Node.js
const { Readable, Writable } = require('stream');

const readable = new Readable({
  read() {
    this.push('data');
    this.push(null);
  }
});

const writable = new Writable({
  write(chunk, encoding, callback) {
    console.log('Writing:', chunk.toString());
    callback();
  }
});

const canWriteMore = writable.write('chunk1');
console.log('Can write more after chunk1:', canWriteMore);

writable.on('drain', () => {
  console.log('Drain event fired');
});
A
Can write more after chunk1: false
B
Writing: chunk1
Can write more after chunk1: true
C
Writing: chunk1
Can write more after chunk1: false
Drain event fired
D
Writing: chunk1
Drain event fired
Can write more after chunk1: true
💡 Hint
Check what writable.write returns when the internal buffer is empty.
🔧 Debug · advanced · 2:00 time limit
Why does this Node.js stream code cause a memory leak?
This code reads from a file and writes to another file using streams. Why might it cause a memory leak related to backpressure?
Node.js
const fs = require('fs');

const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

readable.on('data', (chunk) => {
  writable.write(chunk);
});
A. Because writable.write is called without checking its return value, so the readable stream keeps emitting data even when the writable's buffer is full
B. Because the readable stream is not paused after the 'end' event
C. Because the writable stream is not closed after writing
D. Because the 'data' event is not removed after the first chunk
💡 Hint
Think about what happens if writable.write returns false but readable keeps emitting data.
📝 Syntax · expert · 3:00 time limit
Which option correctly implements backpressure handling in a Node.js stream pipe?
Given a readable and writable stream, which code snippet correctly handles backpressure to avoid memory issues?
A
readable.on('data', (chunk) => {
  if (!writable.write(chunk)) {
    readable.pause();
  }
});
writable.on('drain', () => {
  readable.resume();
});
B
readable.on('data', (chunk) => {
  writable.write(chunk);
});
writable.on('drain', () => {
  readable.resume();
});
C
readable.pipe(writable);
readable.pause();
writable.resume();
D
readable.pipe(writable);
writable.on('drain', () => {
  readable.pause();
});
💡 Hint
Backpressure requires pausing readable when writable buffer is full and resuming on drain.