Consider this simple cache implemented with a Map. What will be logged to the console?
const cache = new Map();

function getData(key) {
  if (cache.has(key)) {
    return cache.get(key);
  }
  const value = key * 2;
  cache.set(key, value);
  return value;
}
console.log(getData(3));
console.log(getData(3));
Think about what happens when the key is already in the cache.
The first call calculates 3 * 2 = 6 and stores it. The second call finds the cached value 6 and returns it.
We want to store a value in a cache that expires after 1 second. Which code snippet correctly sets and removes the cache entry?
Map does not support TTL natively; you must remove the key manually after a delay.
The correct option sets the value and schedules a one-time removal after 1 second. The option that passes a TTL directly to the Map is invalid because Map does not accept a TTL. Another option delays setting the value, not removing it. The remaining option repeatedly deletes the key every second, which is unnecessary.
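As a sketch of the correct approach (the helper name setWithTTL is illustrative, not one of the original options): set the value, then schedule a single setTimeout that deletes the key.

```javascript
const cache = new Map();

// Hypothetical helper: stores a value and schedules a one-time
// removal after ttlMs milliseconds, since Map has no built-in TTL.
function setWithTTL(key, value, ttlMs) {
  cache.set(key, value);
  setTimeout(() => cache.delete(key), ttlMs);
}

setWithTTL('session', 'abc', 1000);
console.log(cache.has('session')); // true immediately after setting
setTimeout(() => console.log(cache.has('session')), 1100); // false once expired
```

Using setTimeout rather than setInterval matters here: the removal should happen exactly once, not repeat every second.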
Look at this cache code snippet. What causes the memory leak?
const cache = {};
function addToCache(key, value) {
cache[key] = value;
}
// Keys are never removed from cache
addToCache('a', 'data1');
addToCache('b', 'data2');
// ... many keys added over time
Think about what happens if you never remove keys from the cache.
The cache object keeps growing as keys are added but never removed. This causes memory to fill up over time, leading to a memory leak.
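One way to fix this, sketched below (the MAX_ENTRIES cap is an illustrative choice, not from the snippet above): bound the cache and evict the oldest entry before inserting past the limit. A Map works well here because it iterates keys in insertion order.

```javascript
const MAX_ENTRIES = 2;    // illustrative cap on cache size
const cache = new Map();  // Map iterates keys in insertion order

function addToCache(key, value) {
  // Evict the oldest entry once the cap is reached,
  // so the cache can no longer grow without bound.
  if (cache.size >= MAX_ENTRIES) {
    const oldestKey = cache.keys().next().value;
    cache.delete(oldestKey);
  }
  cache.set(key, value);
}

addToCache('a', 'data1');
addToCache('b', 'data2');
addToCache('c', 'data3');       // evicts 'a'
console.log([...cache.keys()]); // ['b', 'c']
```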
Given this cache with a max size of 2, what keys remain after the operations?
class LRUCache {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) return null;
    // Re-insert the key so it becomes the most recently used.
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size === this.maxSize) {
      // Evict the least recently used key (first in insertion order).
      const firstKey = this.cache.keys().next().value;
      this.cache.delete(firstKey);
    }
    this.cache.set(key, value);
  }
}

const cache = new LRUCache(2);
cache.set('a', 1);
cache.set('b', 2);
cache.get('a');
cache.set('c', 3);
console.log([...cache.cache.keys()]);
Remember that accessing a key moves it to the most recently used position.
Initially 'a' and 'b' are added. Accessing 'a' moves it to the end (most recent). Adding 'c' removes the least recently used key 'b'. So keys 'a' and 'c' remain.
Why would you choose a WeakMap over a Map for caching objects in Node.js?
Think about how garbage collection works with weak references.
WeakMap holds keys weakly, so if no other references to the key object exist, it can be garbage collected, avoiding memory leaks. Map holds strong references, preventing this.
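A minimal sketch of the difference (the compute function and its object shape are illustrative): a WeakMap caches results per object key, and once the key object is unreachable the entry becomes eligible for garbage collection, whereas a Map would pin it in memory.

```javascript
const cache = new WeakMap(); // keys must be objects; held weakly

function compute(obj) {
  if (cache.has(obj)) return cache.get(obj);
  const result = { doubled: obj.n * 2 }; // stand-in for expensive work
  cache.set(obj, result);
  return result;
}

let user = { n: 21 };
console.log(compute(user).doubled); // 42 (computed, then cached)
console.log(compute(user).doubled); // 42 (served from the cache)

// Dropping the last reference lets the GC reclaim both the key
// and its cached value; with a Map, the entry would stay alive.
user = null;
```

Note the trade-off: WeakMap is not iterable and has no size property, precisely because its entries can vanish at any time.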