What’s your approach to efficient data fetching?
Use a caching data layer (React Query/SWR) for dedup, caching, and background refresh; avoid waterfalls by fetching in parallel or co-locating with the router; fetch only what you need; paginate/virtualize large sets; prefetch likely-next data; and apply optimistic updates for mutations.
Efficient data fetching is about fetching the minimum, as early as possible, as few times as possible, and reusing the result.
1. Use a caching data layer — don't hand-roll
React Query / SWR / RTK Query give you, for free:
- Caching — the same query across components hits the cache, not the network.
- Deduplication — 5 components requesting the same data → 1 request.
- Background refetch & stale-while-revalidate — show cached data instantly, refresh quietly.
- Loading/error state, retries, and cache invalidation on mutation.
Hand-rolling useEffect + fetch means reinventing all of this, badly.
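To make the caching/dedup point concrete, here is a minimal sketch of what such a library does under the hood: identical requests share one promise, so concurrent callers trigger a single fetch. This is an illustration of the concept, not React Query's actual implementation; `dedupFetch` and the key scheme are made up for the example.

```typescript
type Fetcher<T> = () => Promise<T>;

// Cache keyed by query key; an in-flight promise is itself the cache entry,
// so concurrent callers are deduplicated automatically.
const cache = new Map<string, Promise<unknown>>();

function dedupFetch<T>(key: string, fetcher: Fetcher<T>): Promise<T> {
  const hit = cache.get(key);
  if (hit) return hit as Promise<T>; // cached or in flight: no second request
  const promise = fetcher();
  cache.set(key, promise);
  promise.catch(() => cache.delete(key)); // evict on failure so retries work
  return promise;
}

// Demo: three "components" request the same user; the fetcher runs once.
let networkCalls = 0;
const getUser = () => {
  networkCalls++;
  return Promise.resolve({ id: 1, name: "Ada" });
};

const a = dedupFetch("user:1", getUser);
const b = dedupFetch("user:1", getUser);
const c = dedupFetch("user:1", getUser);
```

Real libraries add the rest on top of this core: staleness tracking, background refetch, retries, and invalidation.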
2. Kill request waterfalls
The biggest real-world perf killer: component A fetches, renders B, B fetches, renders C…
- Fetch in parallel when requests are independent (`Promise.all`, parallel queries).
- Hoist fetching — route-level loaders (React Router data APIs, Next.js) start fetches before components render.
- Server Components / SSR — fetch on the server, ship data with the HTML.
- Watch the Network panel for the staircase pattern.
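The difference is easy to see side by side. A sketch with a simulated network call (`fakeFetch` and the 50 ms latency are stand-ins): the sequential version pays the latencies in sum, the parallel version pays only the maximum.

```typescript
// Stand-in for a network call with fixed latency.
const fakeFetch = (name: string, ms: number): Promise<string> =>
  new Promise((resolve) => setTimeout(() => resolve(name), ms));

// Waterfall: total latency ≈ 50 + 50 ms, the "staircase" in the Network panel.
async function waterfall(): Promise<string[]> {
  const user = await fakeFetch("user", 50);
  const settings = await fakeFetch("settings", 50); // waits on user needlessly
  return [user, settings];
}

// Parallel: both start immediately; total latency ≈ max(50, 50) ms.
async function parallel(): Promise<string[]> {
  const [user, settings] = await Promise.all([
    fakeFetch("user", 50),
    fakeFetch("settings", 50),
  ]);
  return [user, settings];
}
```

The same principle is what route loaders exploit: start every independent fetch as early as possible instead of waiting for a component to mount.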
3. Fetch only what you need
- Pagination / infinite scroll / cursor-based for large lists — never fetch 10k rows.
- Field selection — GraphQL, or REST endpoints that return only required fields. Don't over-fetch.
- Code-split data with routes so each page fetches its own.
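A cursor-based page shape can be sketched as follows; `fetchPage`, `PAGE_SIZE`, and the in-memory "server" are illustrative, not any particular API.

```typescript
interface Page<T> {
  items: T[];
  nextCursor: number | null; // null signals the last page
}

// Server-side stand-in: 10k rows we never want to ship at once.
const ALL_ROWS = Array.from({ length: 10_000 }, (_, i) => ({ id: i }));
const PAGE_SIZE = 50;

// Return one bounded page starting at `cursor`.
function fetchPage(cursor: number): Page<{ id: number }> {
  const items = ALL_ROWS.slice(cursor, cursor + PAGE_SIZE);
  const next = cursor + PAGE_SIZE;
  return { items, nextCursor: next < ALL_ROWS.length ? next : null };
}

const first = fetchPage(0);
const second = first.nextCursor !== null ? fetchPage(first.nextCursor) : null;
```

The opaque cursor (here just an offset for simplicity) is what makes infinite scroll stable when rows are inserted or deleted between requests — a real API would encode a row id or timestamp instead.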
4. Fetch at the right time
- Prefetch likely-next data — on link hover, on route enter, for the next page.
- Lazy / on-demand — defer below-the-fold or behind-interaction data.
- Debounce search-as-you-type; cancel stale in-flight requests (AbortController).
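Debounce and cancellation combine naturally: each keystroke resets the timer and aborts the previous in-flight request, so a slow old response can never clobber a newer one. A sketch — `makeSearcher` and the injected `search` function are hypothetical:

```typescript
function makeSearcher(
  search: (query: string, signal: AbortSignal) => Promise<string[]>,
  delayMs = 200
) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  let controller: AbortController | undefined;

  return (query: string): Promise<string[]> =>
    new Promise((resolve, reject) => {
      clearTimeout(timer);  // debounce: drop the previous keystroke's timer
      controller?.abort();  // cancel the previous in-flight request
      timer = setTimeout(() => {
        controller = new AbortController();
        // A real fetch would pass { signal } so abort() cancels the request.
        search(query, controller.signal).then(resolve, reject);
      }, delayMs);
    });
}
```

With the real `fetch`, passing `controller.signal` makes `abort()` reject the stale request with an `AbortError`, which the UI can simply ignore.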
5. Mutations
- Optimistic updates — update the cache immediately, roll back on failure.
- Targeted invalidation — invalidate only the affected queries, not everything.
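The optimistic-update pattern is: snapshot the cache, apply the change locally, fire the mutation, and restore the snapshot on failure. A minimal sketch (the todo cache and `saveTodo` network call are assumptions for the example):

```typescript
interface Todo {
  id: number;
  done: boolean;
}

let todosCache: Todo[] = [{ id: 1, done: false }];

async function toggleTodoOptimistic(
  id: number,
  saveTodo: (t: Todo) => Promise<void> // hypothetical network call
): Promise<void> {
  const snapshot = todosCache; // keep the previous state for rollback
  todosCache = todosCache.map((t) =>
    t.id === id ? { ...t, done: !t.done } : t // optimistic: flip immediately
  );
  try {
    await saveTodo(todosCache.find((t) => t.id === id)!);
  } catch {
    todosCache = snapshot; // server rejected the change: roll back
  }
}
```

Libraries like React Query formalize exactly this in their mutation lifecycle (apply on start, roll back in the error handler, then invalidate the affected queries).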
6. Transport & infra
- HTTP caching (`ETag`, `Cache-Control`), CDN for static/cacheable responses.
- Batch related requests; compression; HTTP/2+ multiplexing.
The mental model
Treat the network as expensive and unreliable. Cache aggressively, fetch in parallel and early, request the minimum, and reuse everything. A caching library plus killing waterfalls handles 80% of real-world data-fetching performance.
Follow-up questions
- What is a request waterfall and how do you detect and fix one?
- Why is a caching data layer better than useEffect + fetch?
- How do route loaders / Server Components change the fetching story?
- How do optimistic updates work and when do they backfire?
Common mistakes
- useEffect + fetch everywhere, re-fetching the same data and hand-rolling cache logic.
- Sequential dependent fetches creating a waterfall.
- Over-fetching — pulling whole objects or huge lists when a few fields/rows suffice.
- Not cancelling stale requests on rapid input or unmount.
Performance considerations
- Waterfalls multiply latency; parallelization and hoisting fetches collapse it.
- Caching/dedup cuts request count dramatically.
- Pagination bounds payload size.
- Prefetching trades a little bandwidth for perceived instant navigation.
Edge cases
- Out-of-order responses from rapid requests.
- Cache invalidation after a mutation affecting multiple queries.
- Paginated data plus real-time updates.
- SSR/hydration — avoiding a re-fetch of data already sent.
Real-world examples
- React Query deduplicating a shared 'current user' query across a dozen components.
- Next.js route-level data fetching eliminating client waterfalls.
- Prefetch-on-hover making list-to-detail navigation feel instant.