
HashMap Implementation from Scratch in DSA C - Time & Space Complexity

Time Complexity: HashMap Implementation from Scratch
O(k + n/m)
Understanding Time Complexity

When building a HashMap from scratch, it is important to understand how fast it can add, find, or remove items.

We want to know how the time to do these actions changes as the number of stored items grows.

Scenario Under Consideration

Analyze the time complexity of the following code snippet.


#include <stdlib.h>

#define SIZE 100

typedef struct Node {
    char* key;
    int value;
    struct Node* next;
} Node;

Node* hashmap[SIZE];

/* Polynomial rolling hash: one step per character, so O(k) for a key of
   length k. */
unsigned int hash(char* key) {
    unsigned int hash = 0;
    for (int i = 0; key[i] != '\0'; i++) {
        hash = (hash * 31) + key[i];
    }
    return hash % SIZE;
}

/* Insert by prepending to the bucket's chain: O(1) once the hash is known.
   Note: this stores the caller's key pointer, not a copy of the string. */
void put(char* key, int value) {
    unsigned int index = hash(key);
    Node* newNode = malloc(sizeof(Node));
    if (newNode == NULL) return;   /* allocation failed */
    newNode->key = key;
    newNode->value = value;
    newNode->next = hashmap[index];
    hashmap[index] = newNode;
}

This code adds a key-value pair to a simple HashMap using chaining for collisions.

Identify Repeating Operations

Identify the loops, recursion, and array traversals that repeat.

  • Primary operation: The loop inside the hash function that processes each character of the key string.
  • How many times: Once per character in the key, so proportional to key length.
  • Secondary operation: Traversing the linked list at hashmap[index] when searching or inserting (not shown here but relevant for get/remove).
  • How many times: Depends on number of items in that bucket (collision chain length).
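The chain traversal described above is not shown in the snippet, so here is a sketch of what a matching get could look like. The names follow the lesson's code, and the definitions are repeated so the sketch compiles on its own; the get function itself is an illustration, not part of the original snippet.

```c
#include <stdlib.h>
#include <string.h>

#define SIZE 100

typedef struct Node {
    char* key;
    int value;
    struct Node* next;
} Node;

Node* hashmap[SIZE];

unsigned int hash(char* key) {
    unsigned int h = 0;
    for (int i = 0; key[i] != '\0'; i++)   /* O(k) over the key length */
        h = (h * 31) + key[i];
    return h % SIZE;
}

void put(char* key, int value) {
    unsigned int index = hash(key);
    Node* newNode = malloc(sizeof(Node));
    if (newNode == NULL) return;
    newNode->key = key;
    newNode->value = value;
    newNode->next = hashmap[index];        /* prepend: O(1) after hashing */
    hashmap[index] = newNode;
}

/* Returns 1 and writes the value through out if the key is present, else 0. */
int get(char* key, int* out) {
    unsigned int index = hash(key);                  /* O(k) to hash */
    for (Node* cur = hashmap[index]; cur != NULL; cur = cur->next) {
        if (strcmp(cur->key, key) == 0) {            /* chain walk: O(n/m) avg */
            *out = cur->value;
            return 1;
        }
    }
    return 0;
}
```

The loop over the chain is exactly the "secondary operation" from the list above: its cost is the length of the collision chain at that bucket.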
How Execution Grows With Input

The time to compute the hash grows with the length of the key, which is usually short and roughly constant.

Inserting is fast if the bucket has few items, but if many keys collide, the linked list grows and operations slow down.

Input Size (n) | Approx. Operations
10             | ~10 hash steps + short list traversal (few items)
100            | ~10 hash steps + longer list traversal if collisions increase
1000           | ~10 hash steps + possibly long list traversal if many collisions

Pattern observation: Hashing cost stays about the same, but collision chains can grow, making operations slower if the table is too small.

Final Time Complexity

Time Complexity: O(k + n/m)

This means time grows with the key length k plus the average number of items per bucket, n/m, where n is the total number of items and m is the bucket count.
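To make the n/m term concrete, here is a tiny helper (hypothetical, not part of the lesson's code) that computes the expected chain length under the assumption that the hash spreads keys evenly:

```c
/* Expected chain length with n items spread evenly over m buckets.
   With the fixed SIZE of 100 above: 100 items -> 1 per bucket,
   1000 items -> 10 per bucket, and the chains keep growing with n. */
double avg_chain_length(int n, int m) {
    return (double)n / m;
}
```

Because m is fixed at 100 in the snippet, n/m grows linearly with n, which is exactly why a table that never resizes degrades as it fills.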

Common Mistake

[X] Wrong: "HashMap operations always take constant time no matter what."

[OK] Correct: If many keys collide, linked lists grow and operations slow down, so time depends on how well the hash spreads keys.
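A concrete collision makes this point vivid. The two keys below are hand-picked for this hash (the function is repeated from the snippet so this compiles alone); since their raw hashes are identical, they land in the same bucket for any table size:

```c
#define SIZE 100

unsigned int hash(char* key) {
    unsigned int h = 0;
    for (int i = 0; key[i] != '\0'; i++)
        h = (h * 31) + key[i];
    return h % SIZE;
}

/* "ab": 31*'a' + 'b' = 31*97 + 98 = 3105
   "bC": 31*'b' + 'C' = 31*98 + 67 = 3105
   Same raw hash, so the same bucket: a guaranteed collision, and both
   keys end up on one chain. */
```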

Interview Connect

Understanding how HashMaps work under the hood helps you explain trade-offs and choose the right data structure confidently.

Self-Check

"What if we increased the number of buckets (SIZE) as the number of items grows? How would the time complexity change?"
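One way to explore this question in code: a sketch of a resizable table (all names here are assumptions for illustration, extending the fixed-SIZE design in the lesson). Doubling the bucket count whenever the load factor n/m exceeds 1 keeps chains short, so average operations stay O(k), at the cost of an occasional O(n) rehash.

```c
#include <stdlib.h>
#include <string.h>

typedef struct Node {
    char* key;
    int value;
    struct Node* next;
} Node;

/* A table with a variable bucket count, so it can grow. */
typedef struct {
    Node** buckets;
    int m;   /* bucket count */
    int n;   /* stored items */
} Map;

static unsigned int hash_mod(char* key, int m) {
    unsigned int h = 0;
    for (int i = 0; key[i] != '\0'; i++)
        h = (h * 31) + key[i];
    return h % m;
}

static void put_node(Map* map, Node* node) {
    unsigned int idx = hash_mod(node->key, map->m);
    node->next = map->buckets[idx];
    map->buckets[idx] = node;
}

/* Double the bucket count and rehash every node. One resize costs O(n),
   but doubling makes the amortized cost per insert O(1). */
void resize(Map* map) {
    Node** old = map->buckets;
    int old_m = map->m;
    map->m *= 2;
    map->buckets = calloc(map->m, sizeof(Node*));
    for (int i = 0; i < old_m; i++) {
        Node* cur = old[i];
        while (cur != NULL) {
            Node* next = cur->next;
            put_node(map, cur);   /* re-bucket under the new m */
            cur = next;
        }
    }
    free(old);
}

void put(Map* map, char* key, int value) {
    if (map->n + 1 > map->m)      /* keep the load factor n/m <= 1 */
        resize(map);
    Node* node = malloc(sizeof(Node));
    if (node == NULL) return;
    node->key = key;
    node->value = value;
    put_node(map, node);
    map->n++;
}

int get(Map* map, char* key, int* out) {
    for (Node* cur = map->buckets[hash_mod(key, map->m)]; cur; cur = cur->next)
        if (strcmp(cur->key, key) == 0) { *out = cur->value; return 1; }
    return 0;
}
```

With resizing in place, n/m stays bounded by a constant, so the O(k + n/m) bound collapses back toward O(k) on average.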