
Fractional Knapsack Problem in DSA C - Time & Space Complexity

Time Complexity: Fractional Knapsack Problem
O(n log n)
Understanding Time Complexity

We want to understand how the time needed to solve the Fractional Knapsack Problem changes as the number of items grows.

Specifically, how does the algorithm's work increase when we have more items to consider?

Scenario Under Consideration

Analyze the time complexity of the following code snippet.


// Fractional Knapsack Problem
// items[] contains pairs of (value, weight)
// n is number of items, W is max capacity

// Step 1: Calculate value per weight for each item
// Step 2: Sort items by value per weight descending
// Step 3: Take items fully or partially until capacity fills

#include <stdlib.h>  // qsort

typedef struct {
    int value;
    int weight;
} Item;

// Compare items by value-per-weight ratio, descending
int compare(const void *a, const void *b) {
    const Item *x = (const Item *) a;
    const Item *y = (const Item *) b;
    double rx = (double) x->value / x->weight;
    double ry = (double) y->value / y->weight;
    return (ry > rx) - (ry < rx);
}

double fractionalKnapsack(int n, int W, Item items[]) {
    // Steps 1 & 2: sort items by value/weight ratio (descending)
    qsort(items, n, sizeof(Item), compare);

    int currentWeight = 0;
    double totalValue = 0.0;

    // Step 3: take whole items while they fit, then a fraction
    // of the first item that does not fit
    for (int i = 0; i < n; i++) {
        if (currentWeight + items[i].weight <= W) {
            currentWeight += items[i].weight;
            totalValue += items[i].value;
        } else {
            int remain = W - currentWeight;
            totalValue += items[i].value * ((double) remain / items[i].weight);
            break;
        }
    }
    return totalValue;
}

This code sorts items by their value per weight and then picks as much as possible from the best items until the knapsack is full.

Identify Repeating Operations

Identify the loops, recursive calls, and array traversals that repeat.

  • Primary operation: Sorting the array of items by value per weight.
  • How many times: Sorting takes about n log n steps for n items.
  • Secondary operation: One loop through all items to pick them.
  • How many times: This loop runs at most n times.
How Execution Grows With Input

As the number of items increases, the sorting step grows faster than the picking loop.

Input Size (n)    Approx. Operations
10                about 10 * log2(10)     ≈ 33 sorting steps   + 10 picking steps
100               about 100 * log2(100)   ≈ 664 sorting steps  + 100 picking steps
1000              about 1000 * log2(1000) ≈ 9966 sorting steps + 1000 picking steps

Pattern observation: The sorting step dominates and grows faster than the picking loop as n increases.

Final Time Complexity

Time Complexity: O(n log n)

This means the running time grows slightly faster than linearly in the number of items: the picking loop is linear, but sorting contributes an extra log n factor that dominates as the list grows.

Common Mistake

[X] Wrong: "The picking loop is the slowest part, so the time complexity is O(n)."

[OK] Correct: Sorting the items takes more time than just picking them, so the overall time depends mostly on sorting, which is O(n log n).

Interview Connect

Pointing out that sorting, not the picking loop, dominates the running time shows an interviewer that you can identify the dominant term of an algorithm rather than just counting loops.

Self-Check

"What if the items were already sorted by value per weight? How would the time complexity change?"