readRDS and saveRDS in R Programming - Time & Space Complexity
When working with files in R, it's important to know how the time to read or save data changes as the data size grows. Here we analyze how the time cost scales when using the readRDS and saveRDS functions.
Analyze the time complexity of the following code snippet.

```r
# Create a sample object and save it to a file
my_data <- data.frame(x = 1:100, y = rnorm(100))
saveRDS(my_data, file = "data.rds")

# Read the object back from the file
loaded_data <- readRDS("data.rds")
```

This code saves an R object to disk and then reads it back into memory.
Identify the operations that repeat:
- Primary operation: serializing (writing) or deserializing (reading) each element of the object.
- How many times: once for each element or data unit inside the object, so the count is proportional to the object's size.
As the size of the data object grows, the time to save or read it grows roughly in direct proportion.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 | 10 units processed |
| 100 | 100 units processed |
| 1000 | 1000 units processed |
Pattern observation: Doubling the data size roughly doubles the time needed to read or save.
Time Complexity: O(n)
This means the time to read or save grows linearly with the size of the data.
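One way to see this linear pattern in practice is to time saveRDS and readRDS on objects of increasing size. The sketch below uses `system.time()` and a temporary file; the sizes chosen are illustrative, not prescribed by the analysis above.

```r
# Sketch: measure save and read times for an object with n rows.
time_rds <- function(n) {
  my_data <- data.frame(x = seq_len(n), y = runif(n))
  path <- tempfile(fileext = ".rds")

  save_time <- system.time(saveRDS(my_data, file = path))[["elapsed"]]
  read_time <- system.time(loaded_data <- readRDS(path))[["elapsed"]]

  file.remove(path)
  c(save = save_time, read = read_time)
}

# Doubling n should roughly double the elapsed times.
for (n in c(1e5, 2e5, 4e5)) {
  t <- time_rds(n)
  cat(sprintf("n = %.0f: save %.3fs, read %.3fs\n", n, t["save"], t["read"]))
}
```

On small inputs the fixed cost of opening the file dominates, so the linear trend is clearest once the objects are reasonably large.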
[X] Wrong: "Reading or saving small files takes the same time as big files."
[OK] Correct: The time depends on how much data is processed; larger files take longer because more data must be serialized or deserialized.
Understanding how file operations scale helps you reason about program speed and efficiency in real projects.
"What if we compressed the data before saving? How would the time complexity change?"
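Compression does not change the asymptotic picture: every element still has to be processed once, so the cost stays O(n), but the constant factors shift (more CPU work per byte in exchange for smaller files and less disk I/O). In fact, saveRDS compresses with gzip by default; passing `compress = FALSE` writes raw bytes. The sketch below compares the resulting file sizes on repetitive (highly compressible) sample data.

```r
# Sketch: compare saveRDS output with and without compression.
# compress = TRUE (the default) uses gzip; compress = FALSE skips compression.
my_data <- data.frame(x = 1:50000, y = rep(letters, length.out = 50000))

p_gz  <- tempfile(fileext = ".rds")
p_raw <- tempfile(fileext = ".rds")

saveRDS(my_data, file = p_gz)                     # gzip-compressed (default)
saveRDS(my_data, file = p_raw, compress = FALSE)  # uncompressed

cat("compressed:  ", file.size(p_gz), "bytes\n")
cat("uncompressed:", file.size(p_raw), "bytes\n")

file.remove(p_gz, p_raw)
```

Whether compression makes the overall operation faster or slower depends on whether CPU or disk is the bottleneck; for slow disks or network storage, the smaller compressed file often wins despite the extra CPU work.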