When NOT to use map / filter / reduce
Avoid functional iteration when: you need early exit (use `for` / `for..of` / `find` / `some`), you're chaining many passes over a large array (multiple traversals + allocations — use one for loop), the side effect is the point (`forEach`, but `for..of` reads better), or readability suffers from a tangled `reduce`. Idiomatic for transforms; not a universal hammer.
map / filter / reduce are great for pure transformations. They become problems in specific shapes.
1. You need early exit
map/filter/reduce always visit every element. To stop early:

```js
// Bad
arr.filter((x) => x.id === target)[0];

// Good
arr.find((x) => x.id === target);

// Or (inside a function):
for (const x of arr) if (x.id === target) return x;
```

`find`, `some`, and `every` short-circuit; `filter` doesn't.
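To see the short-circuit in action, count callback invocations; a small sketch with illustrative data:

```js
const arr = [1, 2, 3, 4, 5];

// filter has no way to stop: it visits all 5 elements.
let filterCalls = 0;
arr.filter((x) => { filterCalls++; return x === 2; });

// some stops as soon as the predicate returns true.
let someCalls = 0;
arr.some((x) => { someCalls++; return x === 2; });

console.log(filterCalls, someCalls); // 5 2
```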
2. Multiple chained passes on a large array
```js
const result = items
  .filter((x) => x.active)
  .map((x) => x.amount)
  .reduce((sum, n) => sum + n, 0);
```

For a 1M-element array this is 3 traversals + 2 intermediate arrays. The clearer-and-faster version:
```js
let sum = 0;
for (const x of items) if (x.active) sum += x.amount;
```

One pass, no allocation. For small arrays (under a few thousand elements) the difference is negligible and readability wins. For hot paths or large data, fuse.
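If you want one traversal without giving up the functional style, the same fusion fits in a single reduce (illustrative data; same result, no intermediate arrays):

```js
const items = [
  { active: true, amount: 10 },
  { active: false, amount: 5 },
  { active: true, amount: 20 },
];

// One pass: skip inactive items, accumulate amounts.
const sum = items.reduce((acc, x) => (x.active ? acc + x.amount : acc), 0);

console.log(sum); // 30
```

Whether this reads better than the for loop is taste; the point is that fusing, not the loop keyword, removes the extra traversals and allocations.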
3. The side effect is the point
forEach exists, but it's awkward — no break, no return, callback overhead:

```js
arr.forEach((x) => log(x)); // ok
for (const x of arr) log(x); // clearer, supports break/continue
```

4. The reduce is hard to read
reduce shines for genuine folds (sum, max, build-a-map). It's commonly abused to mean "do many things":
```js
// Hard to read
const { active, archived } = items.reduce(
  (acc, x) => {
    if (x.archived) acc.archived.push(x); else acc.active.push(x);
    return acc;
  },
  { active: [], archived: [] },
);

// Easier
const active = [];
const archived = [];
for (const x of items) (x.archived ? archived : active).push(x);
```

If you can't explain the reducer's accumulator in one sentence, rewrite it.
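For contrast, a fold where reduce earns its keep — building a lookup, where the accumulator passes the one-sentence test ("a Map from id to item"). Illustrative data:

```js
const items = [
  { id: "a", amount: 10 },
  { id: "b", amount: 20 },
];

// Accumulator: a Map from id to item.
const byId = items.reduce((m, x) => m.set(x.id, x), new Map());

console.log(byId.get("b").amount); // 20
```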
5. Async work
`array.map(async ...)` returns an array of Promises — newcomers expect awaited results. You almost always want:

```js
const results = await Promise.all(items.map((x) => fetch(x.url)));
// or, for concurrency limits: p-limit, or a for..of loop with await
```

6. Mutating in a map
`.map` is for new values; mutating the input mid-map is a smell. Use a for loop and be explicit.
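A sketch of the smell and the explicit alternative, with illustrative data:

```js
const orders = [{ total: 10 }, { total: 20 }];

// Smell: map called only for its side effect; the returned array is discarded.
orders.map((o) => { o.total += 1; });

// Explicit: a plain loop signals "I'm mutating" and returns nothing to ignore.
for (const o of orders) o.total += 1;
```

If you genuinely need new values, map and return them; if you need mutation, a loop makes the intent visible at the call site.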
7. Hot inner loops in libraries
Some hot paths (rendering, geometry, parsers) prefer a raw `for` loop for the JIT's predictability and to avoid callback overhead. Most app code never reaches this regime; don't preoptimize.
Decision
| Situation | Use |
|---|---|
| Pure transform list → list | map |
| Filter | filter |
| Fold to single value | reduce |
| Find first match | find |
| Any/all | some/every |
| Early exit | for..of + break |
| Many chained ops on big arrays | one for loop |
| Async per item | Promise.all(map) or controlled concurrency |
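The "controlled concurrency" cell can be implemented without a library. A minimal sketch — the `mapLimit` name is mine, not a standard API:

```js
// Run fn over items with at most `limit` calls in flight.
// Results come back in input order regardless of completion order.
async function mapLimit(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    // Each worker pulls the next unclaimed index until none remain.
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    worker,
  );
  await Promise.all(workers);
  return results;
}
```

Because workers grab indices as they free up, a slow item delays only its own slot, not the whole batch; for production use, a library like p-limit adds error handling and cancellation on top of the same idea.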
Interview framing
"They're great for pure transforms. I avoid them when: I need early exit (find/some/for..of); I'm chaining many passes on a big array (allocations + traversals — fuse into one for); the side effect is the point (for..of reads better than forEach); the reduce has gotten unreadable; or I'm doing async work where Promise.all(map(...)) is what I actually want. Hot-path code in libraries sometimes benefits from raw loops, but in app code I optimize for readability and let the JIT do its job."
Follow-up questions
- How would you implement a limited-concurrency map?
- When does reduce read better than for?
- Why is `filter(...)[0]` slower than just `find`?
Common mistakes
- Filter + [0] instead of find.
- Chaining 4 passes on a million-row array.
- array.map(async) without awaiting.
- Reducer doing 5 things at once.
Performance considerations
- Each chained method allocates an intermediate array. For small arrays this is free; for large or hot loops, fuse into one pass.
Edge cases
- Sparse arrays: map skips holes; for..of doesn't.
- `this` in callbacks: arrow vs regular.
- Mutating the source during iteration.
Real-world examples
- React render lists, lodash/fp pipelines, parsers and tokenizers using raw loops, server log processing.