Node.js framework · ~20 mins

Buffer and streams relationship in Node.js - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Buffer and Streams Master
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
2:00 remaining
What is the role of a Buffer in Node.js streams?
In Node.js, when working with streams, what is the main purpose of a Buffer?
A. It temporarily holds chunks of data to manage flow between fast and slow streams.
B. It permanently stores all data from a stream for later use.
C. It converts streams into JSON objects automatically.
D. It encrypts data passing through the stream for security.
Attempts: 2 left
💡 Hint
Think about how data flows smoothly between parts that work at different speeds.
⚙️ Component Behavior
intermediate
2:00 remaining
What happens when a readable stream's internal buffer is full?
Consider a readable stream in Node.js. What occurs when its internal buffer reaches its highWaterMark limit?
A. The stream discards new incoming data until space is available.
B. The stream throws an error and closes immediately.
C. The stream pauses reading from the source until the buffer is drained.
D. The stream automatically increases the buffer size without pausing.
Attempts: 2 left
💡 Hint
Think about how streams prevent memory overload by controlling reading speed.
📝 Syntax
advanced
2:00 remaining
Identify the correct way to create a Buffer from a stream chunk
Given a chunk of data from a readable stream, which code correctly creates a Buffer from it in Node.js?
Node.js
stream.on('data', (chunk) => {
  // create buffer here
});
A. const buf = new Buffer(chunk);
B. const buf = Buffer.from(chunk);
C. const buf = Buffer.alloc(chunk);
D. const buf = Buffer.create(chunk);
Attempts: 2 left
💡 Hint
Use the modern, safe method to create buffers from data.
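For reference, a sketch of the modern API in action (the chunk value is made up; a real chunk may arrive as a string or a Buffer):

```javascript
// Buffer.from() copies chunk data into a new Buffer; new Buffer() is
// deprecated because it could leave memory uninitialized.
const chunk = 'stream data';
const buf = Buffer.from(chunk);
console.log(Buffer.isBuffer(buf)); // true
console.log(buf.toString());       // "stream data"
```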
🔍 State Output
advanced
2:00 remaining
What is the output when piping a readable stream to a writable stream with backpressure?
Consider this Node.js code snippet:
Node.js
const { Readable, Writable } = require('stream');
const readable = Readable.from(['a', 'b', 'c']);
const writable = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(() => {
      console.log(chunk.toString());
      callback();
    }, 100);
  }
});
readable.pipe(writable);
What will be the order and timing of the console output?
A. The letters 'a', 'b', 'c' print in order with about 100ms delay between each.
B. All letters print immediately at once without delay.
C. Only 'a' prints, then the program stops due to backpressure.
D. The letters print in reverse order: 'c', 'b', 'a'.
Attempts: 2 left
💡 Hint
Think about how backpressure slows down writing but keeps order.
🔧 Debug
expert
3:00 remaining
Why does this stream pipeline cause a memory leak?
Examine this Node.js code:
Node.js
const fs = require('fs');
const { Transform } = require('stream');
const readStream = fs.createReadStream('largefile.txt');
const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    // process chunk
    callback(null, chunk);
  }
});
readStream.pipe(transformStream);
// No pipe to a writable stream and no 'data' event listener
Why might this code cause a memory leak?
A. Because the pipe method requires a third argument to work properly.
B. Because the transform stream is missing the flush method.
C. Because the file 'largefile.txt' does not exist, causing an error.
D. Because the readable stream's data is not consumed, its internal buffer fills up indefinitely.
Attempts: 2 left
💡 Hint
Think about what happens if data is produced but never read or consumed.