Node.js framework · ~20 mins

Why streams are needed in Node.js - Challenge Your Understanding

Challenge - 5 Problems
🎖️
Stream Mastery Badge
Get all challenges correct to earn this badge!
Test your skills under time pressure!
🧠 Conceptual
intermediate
2:00 remaining
Why use streams for large files in Node.js?
You want to read a very large file in Node.js. Why is using streams better than reading the whole file at once?
A. Streams convert the file into a database for easier access.
B. Streams automatically compress the file to make it smaller in memory.
C. Streams load the entire file into memory, just faster than other methods.
D. Streams let you process the file piece by piece, so you use less memory and start working faster.
Attempts:
2 left
💡 Hint
Think about memory use when handling big files.
⚙️ Component Behavior
intermediate
2:00 remaining
What happens when you pipe a readable stream to a writable stream?
In Node.js, you connect a readable stream to a writable stream using pipe(). What is the main effect of this connection?
Node.js
const fs = require('fs');
const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');
readable.pipe(writable);
A. The entire file is copied instantly before any processing.
B. The writable stream sends data back to the readable stream.
C. Data flows automatically from the readable stream to the writable stream in chunks.
D. The streams merge into one stream that reads and writes at the same time.
Attempts:
2 left
💡 Hint
Consider how data moves between streams.
🖥️ State Output
advanced
2:00 remaining
What is the output when reading a file with streams?
Consider this Node.js code reading a file with a stream. What will be printed to the console?
Node.js
const fs = require('fs');
const readable = fs.createReadStream('file.txt', { encoding: 'utf8' });
readable.on('data', chunk => {
  console.log(chunk.length);
});
A. Multiple numbers showing the length of each chunk read from the file.
B. One number showing the total length of the file.
C. An error because chunk.length is undefined.
D. No output because the stream does not emit 'data' events.
Attempts:
2 left
💡 Hint
Streams emit 'data' events multiple times for chunks.
📝 Syntax
advanced
2:00 remaining
Identify the syntax error in this stream code
Which option contains a syntax error when creating a readable stream in Node.js?
Node.js
const fs = require('fs');
const stream = fs.createReadStream('data.txt', { encoding: 'utf8' });
A. const stream = fs.createReadStream('data.txt' encoding: 'utf8');
B. const stream = fs.createReadStream('data.txt', { encoding: 'utf8' });
C. const stream = fs.createReadStream('data.txt', encoding = 'utf8');
D. const stream = fs.createReadStream('data.txt', { encoding: utf8 });
Attempts:
2 left
💡 Hint
Check the commas and braces in the options.
🔧 Debug
expert
3:00 remaining
Why does this stream code cause a memory leak?
This Node.js code reads a large file but causes increasing memory use until crash. What is the likely cause?
Node.js
const fs = require('fs');
const readable = fs.createReadStream('largefile.txt');
readable.on('data', chunk => {
  // process chunk but do not consume it fully
});
A. The file is too large to read with streams, so memory leaks always happen.
B. The 'data' event handler does not pause the stream, causing data to accumulate in memory.
C. The stream is missing an 'end' event listener, causing memory leaks.
D. The encoding option is missing, so chunks are not released from memory.
Attempts:
2 left
💡 Hint
Think about how streams manage flow control.