
Streams vs. loading an entire file into memory in Node.js - Practice Questions

Challenge - 5 Problems
🧠 Conceptual
intermediate
Why use streams instead of loading entire files?
Which of the following best explains why streams are preferred over loading an entire file into memory in Node.js?
A. Streams allow processing data piece by piece, reducing memory usage for large files.
B. Streams automatically compress files to save disk space.
C. Loading entire files is faster and uses less CPU than streams.
D. Streams require the entire file to be loaded before processing starts.
💡 Hint
Think about memory use when files are very large.
Component Behavior
intermediate
What happens when reading a large file with streams?
Given this Node.js code snippet using streams, what will be the output behavior?
Node.js
const fs = require('fs');
const stream = fs.createReadStream('largefile.txt');
stream.on('data', chunk => {
  console.log('Received chunk of size:', chunk.length);
});
stream.on('end', () => {
  console.log('Finished reading file');
});
A. Reads the entire file at once and logs one chunk size.
B. Logs 'Finished reading file' immediately, then logs chunk sizes.
C. Throws an error because the 'data' event is not supported on streams.
D. Logs multiple 'Received chunk of size:' messages as data arrives, then logs 'Finished reading file'.
💡 Hint
Streams emit 'data' events multiple times as chunks arrive.
📝 Syntax
advanced
Identify the error in this stream reading code
What error will this Node.js code produce when trying to read a file using streams?
Node.js
const fs = require('fs');
const stream = fs.createReadStream('file.txt');
stream.on('data', (chunk) => {
  console.log(chunk.toString());
});
stream.on('finish', () => {
  console.log('Done reading');
});
A. No error; logs file content and 'Done reading'.
B. SyntaxError due to missing semicolon.
C. Error: 'finish' event does not exist on readable streams.
D. TypeError because chunk is undefined.
💡 Hint
Check if 'finish' is the right event for reading streams.
State Output
advanced
Memory usage difference between streams and full file read
If you read a 1GB file using fs.readFile versus fs.createReadStream, what is the expected memory usage behavior?
A. Both use the same memory because Node.js caches files automatically.
B. fs.readFile uses much more memory because it loads the entire file; streams use less memory by processing chunks.
C. Streams use more memory because they keep multiple chunks in memory at once.
D. fs.readFile uses less memory because it reads faster.
💡 Hint
Think about how much data is held in memory at once.
🔧 Debug
expert
Why does this stream pipeline seem to hang without output?
Consider this Node.js code that reads a file and writes to another file using streams. Why does it seem to hang without finishing?
Node.js
const fs = require('fs');
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');
readStream.pipe(writeStream);
// No event handlers attached
A. The code is correct; it should finish, but no logs show because no events are handled.
B. The writeStream is not closed, so the process waits indefinitely.
C. The readStream is not paused, causing a deadlock.
D. The pipe method is missing a callback to start the flow.
💡 Hint
Think about what happens if no logs or events are used.