Running JavaScript using Node.js - Time & Space Complexity
When running JavaScript with Node.js, it's helpful to understand how the time your code takes grows as you give it more work.
We want to see how the number of operations changes when the input size changes.
Analyze the time complexity of the following code snippet.
```javascript
const fs = require('fs');

function readFiles(fileNames) {
  fileNames.forEach(name => {
    const content = fs.readFileSync(name, 'utf8');
    console.log(content.length);
  });
}

readFiles(['file1.txt', 'file2.txt', 'file3.txt']);
```
This code reads several files one by one and prints the length of each.
Identify the loops, recursion, or array traversals that cause work to repeat.
- Primary operation: Reading each file synchronously inside a loop.
- How many times: Once for each file in the input list.
As you add more files, the total time grows roughly in direct proportion to the number of files.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 file reads |
| 100 | 100 file reads |
| 1000 | 1000 file reads |
Pattern observation: Doubling the number of files roughly doubles the work done.
Time Complexity: O(n)
This means the time grows in a straight line with the number of files you read.
[X] Wrong: "Reading multiple files at once will always be faster and have constant time."
[OK] Correct: Even if you read files in parallel, you still perform one read per file, so the total work grows with the input size. Parallelism can shorten wall-clock time, but the work remains O(n).
Understanding how your Node.js code scales with input size shows you can write efficient programs and reason about performance clearly.
"What if we changed from reading files synchronously to reading them asynchronously with promises? How would the time complexity change?"