Memory forensics basics in Cybersecurity - Time & Space Complexity
In memory forensics, it is important to understand how analysis time grows as the memory image gets larger. In other words, we want to know how the effort to scan and extract useful data changes as the memory dump grows.
Analyze the time complexity of the following memory scanning code.
```
for each byte in memory_dump:
    if byte matches pattern:
        record location
    continue scanning
```
This code scans every byte in a memory dump to find a specific pattern and records where it appears.
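The pseudocode above can be sketched as a runnable Python function. The function name `scan_for_byte` and the single-byte pattern are illustrative assumptions, not part of the original exercise:

```python
def scan_for_byte(memory_dump: bytes, pattern: int) -> list[int]:
    """Return every offset in the dump where the pattern byte appears."""
    locations = []
    for offset, byte in enumerate(memory_dump):  # one check per byte
        if byte == pattern:
            locations.append(offset)  # record location
    return locations

# Example: find 0x90 (the x86 NOP byte, common in shellcode padding)
dump = bytes([0x00, 0x90, 0x41, 0x90])
print(scan_for_byte(dump, 0x90))  # -> [1, 3]
```

The loop body runs exactly once per byte, which is what makes the operation count track the input size.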
Look for loops or repeated checks in the code.
- Primary operation: Checking each byte in the memory dump for a pattern.
- How many times: Once for every byte in the memory dump.
As the memory dump size grows, the number of bytes to check grows too.
| Input Size (n) | Approx. Operations |
|---|---|
| 10 bytes | 10 checks |
| 100 bytes | 100 checks |
| 1000 bytes | 1000 checks |
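The counts in the table can be reproduced by instrumenting the scan with an operation counter (a sketch; the `count_checks` helper is illustrative):

```python
def count_checks(memory_dump: bytes, pattern: int) -> int:
    """Count how many byte comparisons a full scan performs."""
    checks = 0
    for byte in memory_dump:
        checks += 1          # one comparison per byte
        _ = (byte == pattern)
    return checks

for n in (10, 100, 1000):
    dump = bytes(n)  # a dump of n zero bytes
    print(n, count_checks(dump, 0x90))  # the check count always equals n
```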
Pattern observation: The number of operations grows directly with the size of the memory dump.
Time Complexity: O(n)
This means scan time grows linearly with memory size: doubling the dump doubles the number of checks.
[X] Wrong: "The scan time stays the same no matter how big the memory is."
[OK] Correct: Because the code checks every byte, more memory means more checks and more time.
Understanding how scanning time grows with memory size helps you predict how forensic tools will perform on large memory images, and explain why.
"What if the code searched only every other byte instead of every byte? How would the time complexity change?"
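One way to explore the closing question: scanning with a stride of 2 halves the number of checks, but the count still grows in proportion to n, so the complexity remains O(n). A sketch, with an assumed `stride` parameter:

```python
def count_strided_checks(memory_dump: bytes, stride: int = 2) -> int:
    """Count comparisons when only every `stride`-th byte is checked."""
    checks = 0
    for offset in range(0, len(memory_dump), stride):
        checks += 1  # roughly n / stride checks: fewer, but still linear in n
    return checks

for n in (10, 100, 1000):
    print(n, count_strided_checks(bytes(n)))  # -> n // 2 checks each time
```

Constant factors like the 1/2 here are dropped in Big-O notation, which is why O(n/2) and O(n) describe the same growth rate.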