# CSV file reading and writing in PHP - Time & Space Complexity
When working with CSV files in PHP, it is important to understand how the time to read or write a file grows as the file gets bigger. In other words, we want to know how the program's running time changes as the number of rows in the CSV changes.
Analyze the time complexity of the following code snippet.
```php
$inputFile = fopen('data.csv', 'r');
$outputFile = fopen('output.csv', 'w');

// Read one row at a time until end of file
while (($row = fgetcsv($inputFile)) !== false) {
    // Process row data
    fputcsv($outputFile, $row);   // one write per row read
}

fclose($inputFile);
fclose($outputFile);
```
This code reads each row from a CSV file and writes it to another CSV file, processing one row at a time.
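To see the single-pass behavior concretely, here is a self-contained sketch that builds a tiny `data.csv`, then runs the same row-by-row copy loop with a basic check that both files opened. The sample data and the `$count` variable are illustrative assumptions, not part of the original snippet.

```php
<?php
// Build a tiny sample CSV so the example runs on its own.
$rows = [
    ['id', 'name'],
    ['1', 'Ada'],
    ['2', 'Grace'],
];
$fh = fopen('data.csv', 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);

// Same O(n) copy loop as above, with a basic open check added.
$inputFile  = fopen('data.csv', 'r');
$outputFile = fopen('output.csv', 'w');
if ($inputFile === false || $outputFile === false) {
    exit("Could not open files\n");
}

$count = 0;
while (($row = fgetcsv($inputFile)) !== false) {
    fputcsv($outputFile, $row);   // one write per row read
    $count++;
}
fclose($inputFile);
fclose($outputFile);

echo $count, "\n";                // one pass over all n rows
```

Because each row is read, written, and then discarded, only one row is held in memory at a time, regardless of file size.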
Identify the constructs that repeat work: loops, recursion, and array traversals.
- Primary operation: Reading each row from the CSV file and writing it to another file.
- How many times: Once for every row in the CSV file (let's call the number of rows n).
As the number of rows grows, the program reads and writes more rows one by one.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | About 10 reads and 10 writes |
| 100 | About 100 reads and 100 writes |
| 1000 | About 1000 reads and 1000 writes |
Pattern observation: The number of operations grows directly with the number of rows. Double the rows, double the work.
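The pattern in the table can be checked directly by counting row operations for different input sizes. The helper below is a sketch; the temporary file name and function name are assumptions for illustration.

```php
<?php
// Sketch: count one operation per row read to confirm the linear pattern.
function countRowOps(int $n): int
{
    $path = "bench_$n.csv";

    // Build a file with n rows.
    $fh = fopen($path, 'w');
    for ($i = 0; $i < $n; $i++) {
        fputcsv($fh, [$i, "row$i"]);
    }
    fclose($fh);

    // Read it back, counting one operation per row.
    $ops = 0;
    $fh = fopen($path, 'r');
    while (fgetcsv($fh) !== false) {
        $ops++;
    }
    fclose($fh);
    unlink($path);

    return $ops;
}

echo countRowOps(10), ' ', countRowOps(100), ' ', countRowOps(1000), "\n";
```

Doubling `$n` doubles the count, which is exactly the linear growth the table describes.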
Time Complexity: O(n)
This means the time to read and write grows linearly with the number of rows in the CSV file, assuming individual rows are of roughly similar length; unusually long rows add proportionally more work per row.
[X] Wrong: "Reading a CSV file is always very fast and does not depend on file size."
[OK] Correct: The program reads each row one by one, so bigger files take more time because there are more rows to handle.
Understanding how file reading and writing scales helps you explain your code's efficiency clearly and shows you know how programs behave with bigger data.
"What if we read the entire CSV file into an array first, then process it? How would the time complexity change?"
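One way to answer: loading every row into an array first is still one pass to read plus one pass to process, so time complexity stays O(n). What changes is space: instead of holding one row at a time (O(1) memory), the program holds all n rows at once (O(n) memory). A sketch, where the sample data is an assumption for illustration:

```php
<?php
// Build a small sample file so the example runs on its own.
$fh = fopen('data.csv', 'w');
foreach ([['a', '1'], ['b', '2'], ['c', '3']] as $row) {
    fputcsv($fh, $row);
}
fclose($fh);

// Pass 1: read the entire CSV into an array — O(n) time, O(n) memory.
$rows = [];
$fh = fopen('data.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    $rows[] = $row;
}
fclose($fh);

// Pass 2: process and write every row — another O(n) pass.
$out = fopen('output.csv', 'w');
foreach ($rows as $row) {
    fputcsv($out, $row);
}
fclose($out);

echo count($rows), "\n";
```

O(n) + O(n) is still O(n) time, so the big-O class does not change; the real cost of this approach is memory, which makes the streaming version preferable for large files.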