Implement memoization for an async function with callbacks (cache, deep equality, parallel calls)
Cache by a stable key derived from arguments. Return the cached Promise (so concurrent calls with the same args share one in-flight request — request deduplication). On rejection, evict so retries are possible. For object args, use deep-equality keying (JSON.stringify with sorted keys, or a structured-clone hash). TTL for staleness; capacity bound for memory.
Core idea
function memoizeAsync(fn, { keyFn = JSON.stringify, ttlMs = Infinity, max = 500 } = {}) {
  const cache = new Map(); // key → { promise, expiresAt }
  return function (...args) {
    const key = keyFn(args);
    const entry = cache.get(key);
    const now = Date.now();
    if (entry && entry.expiresAt > now) {
      cache.delete(key);
      cache.set(key, entry); // LRU bump: re-insert so this key becomes the newest
      return entry.promise;
    }
    const promise = Promise.resolve()
      .then(() => fn.apply(this, args))
      .catch((err) => {
        // Evict on failure so retries hit fresh, but only if this promise is
        // still the cached one (a TTL refresh may have replaced the entry).
        if (cache.get(key)?.promise === promise) cache.delete(key);
        throw err;
      });
    cache.set(key, { promise, expiresAt: ttlMs === Infinity ? Infinity : now + ttlMs });
    if (cache.size > max) {
      // Over capacity: drop the least-recently-used entry (first key in insertion order)
      const oldest = cache.keys().next().value;
      cache.delete(oldest);
    }
    return promise;
  };
}

Usage
const fetchUser = memoizeAsync(async (id) => {
  const r = await fetch(`/users/${id}`);
  return r.json();
}, { ttlMs: 60_000, max: 1000 });

const [a, b] = await Promise.all([fetchUser(1), fetchUser(1)]);
// Single network request — both awaits share the same Promise (request dedup).

Key design choices
1. Cache the Promise, not the value
Two concurrent callers with the same key share the same in-flight Promise — request deduplication for free. Caching only the resolved value would race.
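The dedup effect can be seen directly. A minimal standalone sketch (function and counter names are illustrative):

```javascript
// Minimal promise-caching memo to illustrate dedup.
const calls = { count: 0 };
const cache = new Map();

function memo(fn) {
  return (key) => {
    if (cache.has(key)) return cache.get(key); // second caller reuses the in-flight Promise
    const p = fn(key);
    cache.set(key, p);
    return p;
  };
}

const slowDouble = memo(async (n) => {
  calls.count++; // counts real invocations of the underlying function
  await new Promise((r) => setTimeout(r, 10));
  return n * 2;
});

// Two concurrent calls with the same key share one invocation.
const results = Promise.all([slowDouble(21), slowDouble(21)]);
```

Caching only the resolved value would not help here: the second call arrives before the first resolves, so it would miss the cache and fire a second request.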
2. Evict on rejection
If the underlying call fails, drop the entry so retries fire fresh. Otherwise the cache permanently remembers a failure.
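A standalone sketch of the retry behavior (the flaky function and names are illustrative):

```javascript
// Evict-on-rejection: a failed call must not poison the cache.
const cache = new Map();
let attempts = 0;

function memo(fn) {
  return (key) => {
    if (cache.has(key)) return cache.get(key);
    const p = fn(key).catch((err) => {
      cache.delete(key); // drop the failed entry so the next call retries
      throw err;
    });
    cache.set(key, p);
    return p;
  };
}

// Fails on the first attempt, succeeds on the second.
const flaky = memo(async (key) => {
  attempts++;
  if (attempts === 1) throw new Error("transient failure");
  return `ok:${key}`;
});

const retried = flaky("a").catch(() => flaky("a")); // retry gets a fresh attempt
```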
3. Deep-equality keying
For object args (fetchUser({ id: 1, expand: ["posts"] })), use a stable serializer:
function stableStringify(value) {
  if (value === null || typeof value !== "object") return JSON.stringify(value);
  if (Array.isArray(value)) return "[" + value.map(stableStringify).join(",") + "]";
  return "{" + Object.keys(value).sort()
    .map((k) => JSON.stringify(k) + ":" + stableStringify(value[k]))
    .join(",") + "}";
}

Sorts keys so {a:1, b:2} and {b:2, a:1} produce the same key. Handles nested objects.
4. TTL
Use a finite ttlMs for data that goes stale (e.g. a user profile after an edit); leave it at Infinity for no expiry.
5. Capacity (LRU)
Bound memory. Map preserves insertion order — bump on access by delete + set, evict oldest when over capacity. For real LRU at scale, use a doubly-linked list.
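The delete + set bump can be isolated in a small standalone sketch (names are illustrative):

```javascript
// Map-based LRU bound: insertion order doubles as recency order.
const MAX = 2;
const cache = new Map();

function remember(key, value) {
  if (cache.has(key)) cache.delete(key); // bump: re-insert as newest
  cache.set(key, value);
  if (cache.size > MAX) {
    cache.delete(cache.keys().next().value); // first key = least recently used
  }
}

remember("a", 1);
remember("b", 2);
remember("a", 1); // touch "a", so "b" is now the oldest
remember("c", 3); // over capacity, evicts "b"
```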
6. WeakMap for object-keyed memoization
If args are objects you can use directly as keys:
const cache = new WeakMap();
function memo(fn) {
  return (obj) => {
    if (cache.has(obj)) return cache.get(obj);
    const p = fn(obj);
    cache.set(obj, p);
    return p;
  };
}

WeakMap auto-evicts entries once the key object is garbage-collected.
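For example (re-declaring the memo so the snippet runs standalone):

```javascript
// WeakMap memo: cache hits are decided by object identity, not deep equality.
const cache = new WeakMap();
function memo(fn) {
  return (obj) => {
    if (cache.has(obj)) return cache.get(obj);
    const p = fn(obj);
    cache.set(obj, p);
    return p;
  };
}

let invocations = 0;
const load = memo(async (query) => {
  invocations++;
  return Object.keys(query).length;
});

const q = { id: 1 };
const first = load(q);
const second = load(q);        // same object identity, so cache hit
const third = load({ id: 1 }); // equal but distinct object, so cache miss
```

The trade-off: identity, not deep equality, decides hits. That fits dataloader-style per-object caching, but is wrong if callers rebuild equal argument objects on every call.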
When NOT to memoize
- Function has side effects (mutations, logs).
- Inputs are non-deterministic (uses time, random).
- Results are huge (memory).
- Concurrency is the real win, not caching — use a queue/dedup instead.
Edge cases
- Callback-style functions (not Promise-returning): wrap into Promise first or pass the callback through with care.
- AbortSignal in args — exclude from the cache key (don't dedup based on signal identity).
- Multiple call sites sharing the same memoized fn — that's the point; clear cache if state changes invalidate it.
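For the AbortSignal case, a keyFn can strip non-serializable control fields so calls that differ only in their signal still dedup. A sketch (the function name is illustrative):

```javascript
// Drop AbortSignal instances from the cache key via a JSON.stringify replacer.
function keyWithoutSignal(args) {
  return JSON.stringify(args, (k, v) =>
    v instanceof AbortSignal ? undefined : v // signals are omitted from the key
  );
}

const c1 = new AbortController();
const c2 = new AbortController();
const keyA = keyWithoutSignal([{ id: 1, signal: c1.signal }]);
const keyB = keyWithoutSignal([{ id: 1, signal: c2.signal }]);
// keyA === keyB: different signals, same cache key.
```

Cancellation then needs care: one caller aborting must not reject the shared Promise for everyone. A common approach is to reference-count subscribers and abort the underlying request only when all of them have cancelled.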
Interview framing
"Cache the Promise (not the resolved value) keyed by a stable serialization of the args — that gives you request deduplication for free: concurrent calls with the same args share one in-flight request. Evict on rejection so retries work. Add TTL for stale data and an LRU bound for memory. For deep equality on object args, use a key-sorting JSON serializer. WeakMap is cleaner if your keys are objects and you want GC-based eviction. Skip memoization for impure functions or unbounded results."
Follow-up questions
- How would you add request cancellation?
- Compare with React Query's caching.
- When is WeakMap-based memoization better?
Common mistakes
- Caching the resolved value instead of the Promise.
- Not evicting on rejection.
- Unstable JSON.stringify on object args.
- Unbounded cache.
Performance considerations
- Memory grows with cache; bound with TTL + LRU. Key serialization is O(arg size).
Edge cases
- Function called with same args during in-flight — should dedup.
- TTL expiry while in-flight.
- AbortSignal in args.
Real-world examples
- React Query, SWR, dataloader, axios-cache-adapter.