Digital Marketing · ~5 mins

Growth metrics and north star metric in Digital Marketing - Time & Space Complexity

Time Complexity: Growth metrics and north star metric
O(n)
Understanding Time Complexity

When tracking growth metrics and the north star metric, it's important to understand how measuring more data points affects the time it takes to analyze results.

We want to know how the effort to process these metrics grows as we collect more data.

Scenario Under Consideration

Analyze the time complexity of the following process for calculating growth metrics.


// Assume dataPoints is an array of user actions,
// each with a numeric `count` property
function calculateGrowthMetrics(dataPoints) {
  if (dataPoints.length === 0) return 0; // avoid dividing by zero
  let totalActions = 0;
  for (let i = 0; i < dataPoints.length; i++) {
    totalActions += dataPoints[i].count;
  }
  // North star metric: average actions per data point
  let northStarMetric = totalActions / dataPoints.length;
  return northStarMetric;
}

This code sums user actions and calculates the average as the north star metric.
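As a quick sanity check, here is the same function run against a small sample input. The sample data is illustrative (not from the original); each object mirrors the shape the function expects, `{ count: number }`.

```javascript
function calculateGrowthMetrics(dataPoints) {
  if (dataPoints.length === 0) return 0; // avoid dividing by zero
  let totalActions = 0;
  for (let i = 0; i < dataPoints.length; i++) {
    totalActions += dataPoints[i].count;
  }
  return totalActions / dataPoints.length;
}

// Three illustrative data points: 4 + 6 + 8 = 18 actions across 3 points
const sample = [{ count: 4 }, { count: 6 }, { count: 8 }];

console.log(calculateGrowthMetrics(sample)); // 6
```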

Identify Repeating Operations

Identify the loops, recursion, or array traversals that repeat work.

  • Primary operation: Looping through each data point once to sum counts.
  • How many times: Exactly once for each data point in the input array.
How Execution Grows With Input

As the number of data points increases, the time to calculate the total actions grows proportionally.

Input Size (n) | Approx. Operations
10             | 10 additions
100            | 100 additions
1000           | 1000 additions

Pattern observation: The number of additions grows linearly with the input size; doubling the data points doubles the work.
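You can verify this pattern by instrumenting the loop with an operation counter. This is a sketch; makeData is a hypothetical helper (not in the original) that builds n dummy data points.

```javascript
// Build n dummy data points of the shape the function expects.
function makeData(n) {
  return Array.from({ length: n }, () => ({ count: 1 }));
}

// Same summing loop as the article's function, but counting the additions.
function countAdditions(dataPoints) {
  let operations = 0;
  let totalActions = 0;
  for (let i = 0; i < dataPoints.length; i++) {
    totalActions += dataPoints[i].count;
    operations++; // one addition per data point
  }
  return operations;
}

for (const n of [10, 100, 1000]) {
  console.log(`n = ${n}: ${countAdditions(makeData(n))} additions`);
}
```

The counter always equals n, which is exactly the O(n) behavior shown in the table above.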

Final Time Complexity

Time Complexity: O(n)

This means the time to calculate the north star metric grows in direct proportion to the number of data points.

Common Mistake

[X] Wrong: "Calculating the north star metric takes the same time no matter how many data points there are."

[OK] Correct: Because the calculation must look at each data point once, more data means more work and more time.

Interview Connect

Understanding how processing time grows with data size helps you explain how to handle large datasets efficiently in real projects.

Self-Check

What if we stored a running total and updated it with each new data point instead of recalculating from scratch? How would the time complexity change?
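One way to sketch the incremental approach the question hints at (names here are illustrative, not from the original): keep a running total and a count, and update both in O(1) per new data point instead of re-summing the whole array.

```javascript
// Tracker that maintains a running total, so each new data point
// costs O(1) instead of an O(n) recalculation from scratch.
function createMetricTracker() {
  let totalActions = 0;
  let pointCount = 0;
  return {
    addDataPoint(point) {
      totalActions += point.count; // constant work per update
      pointCount++;
    },
    northStarMetric() {
      return pointCount === 0 ? 0 : totalActions / pointCount;
    },
  };
}

const tracker = createMetricTracker();
tracker.addDataPoint({ count: 4 });
tracker.addDataPoint({ count: 6 });
console.log(tracker.northStarMetric()); // 5
```

With this design, reading the north star metric is O(1); the O(n) cost is spread across the n individual updates rather than paid again on every query.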