Advanced Async Patterns
Beyond basic async/await lies a set of powerful orchestration patterns for managing complex asynchronous workflows. These include async iterators for streaming data, retry strategies with exponential backoff, concurrent task pools with configurable limits, rate limiters, and mutex/semaphore patterns for coordinating shared resources. Mastering these patterns is the difference between code that works in demos and code that survives production traffic.
Interactive Visualization
let currentId = 0

async function search(query) {
  const id = ++currentId
  const res = await fetch("/api?q=" + query)
  const data = await res.json()
  if (id === currentId) {
    updateUI(data) // Only latest!
  }
}

search("a")   // id: 1
search("ab")  // id: 2
search("abc") // id: 3
Understanding Advanced Async Patterns
Advanced async patterns go beyond the basics of async/await to solve the real-world challenges of orchestrating many concurrent operations. While Promise.all is sufficient for a handful of parallel tasks, production systems often need to process hundreds or thousands of items without overwhelming downstream services. Concurrent task pools let you cap parallelism at a configurable limit, processing items as fast as possible without exceeding resource constraints.
Retry logic with exponential backoff is essential for any system that communicates over a network. Transient failures — server overloads, DNS hiccups, brief outages — are inevitable. A well-designed retry strategy waits progressively longer between attempts and adds random jitter so that thousands of clients don't retry in lockstep. Combined with AbortController for cancellation and async generators for lazy data streaming, these patterns form the building blocks of resilient JavaScript applications.
Understanding these patterns is a strong differentiator in senior-level interviews. Interviewers look for candidates who can discuss trade-offs: when to use a task pool vs. unbounded parallelism, how to choose backoff parameters, and how to integrate cancellation cleanly through an async pipeline. Being able to implement these from scratch demonstrates deep fluency with Promises and async control flow.
Key Points
- Async generators (async function*) produce values on demand from asynchronous sources
- AbortController integrates with fetch, streams, and custom async workflows for clean cancellation
- Retry with exponential backoff prevents thundering-herd problems on transient failures
- Rate limiting (token bucket, sliding window) prevents API quota exhaustion
- Concurrent task pools cap parallelism to avoid overwhelming servers or memory
- Async mutex/semaphore patterns coordinate access to shared resources without race conditions
- Combining these patterns produces resilient data pipelines for production systems
Code Examples
Async Generator for Paginated API
async function* fetchAllPages(baseUrl, signal) {
  let page = 1;
  let hasMore = true;
  while (hasMore) {
    const res = await fetch(`${baseUrl}?page=${page}`, { signal });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
    yield* data.items; // Yield each item individually
    hasMore = data.hasNextPage;
    page++;
  }
}

// Consumer controls iteration
for await (const item of fetchAllPages('/api/users', signal)) {
  process(item);
  if (shouldStop()) break; // Generator cleans up
}
Async generators lazily fetch pages on demand. The consumer pulls items one at a time and can break early without wasting network requests.
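Cancellation composes with this pattern as well. The sketch below swaps the network call for a hypothetical `ticker` generator (the generator and its names are illustrative, not part of the article's API) so the mechanics are visible: an AbortController stops the async stream mid-iteration.

```javascript
// Hypothetical slow source: yields an incrementing counter until aborted.
async function* ticker(intervalMs, signal) {
  let n = 0;
  while (!signal.aborted) {
    await new Promise((resolve, reject) => {
      const onAbort = () => {
        clearTimeout(id);
        reject(new DOMException('Aborted', 'AbortError'));
      };
      const id = setTimeout(() => {
        signal.removeEventListener('abort', onAbort);
        resolve();
      }, intervalMs);
      // Abort wakes the sleeper immediately instead of waiting out the timer.
      signal.addEventListener('abort', onAbort, { once: true });
    });
    yield n++;
  }
}

const controller = new AbortController();
setTimeout(() => controller.abort(), 120); // cancel the whole stream after ~120ms

(async () => {
  try {
    for await (const n of ticker(30, controller.signal)) {
      console.log('tick', n);
    }
  } catch (err) {
    if (err.name !== 'AbortError') throw err; // real failures still propagate
  }
})();
```

When the signal fires, the pending sleep rejects, the for await loop throws, and the generator's cleanup runs automatically. fetchAllPages gets the same behavior for free because fetch itself rejects with an AbortError when its signal aborts.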
Retry with Exponential Backoff
async function withRetry(fn, options = {}) {
  const {
    maxAttempts = 3,
    baseDelayMs = 1000,
    maxDelayMs = 30000,
    shouldRetry = () => true,
  } = options;
  let attempt = 0;
  while (true) {
    try {
      return await fn();
    } catch (error) {
      attempt++;
      if (attempt >= maxAttempts || !shouldRetry(error)) {
        throw error;
      }
      const delay = Math.min(
        baseDelayMs * 2 ** (attempt - 1) + Math.random() * 1000,
        maxDelayMs
      );
      await new Promise(r => setTimeout(r, delay));
    }
  }
}

// Usage
const data = await withRetry(
  () => fetch('/api/flaky-endpoint').then(r => r.json()),
  {
    maxAttempts: 5,
    shouldRetry: (err) => err.status !== 401,
  }
);
Exponential backoff doubles the wait time after each failure with jitter to prevent synchronized retries across clients.
Concurrent Task Pool with Limit
async function pooledMap(items, fn, concurrency = 5) {
  const results = new Array(items.length);
  let nextIndex = 0;

  async function worker() {
    while (nextIndex < items.length) {
      const index = nextIndex++;
      results[index] = await fn(items[index], index);
    }
  }

  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}

// Process 100 URLs with max 5 concurrent fetches
const urls = Array.from({ length: 100 }, (_, i) => `/api/item/${i}`);
const responses = await pooledMap(
  urls,
  async (url) => {
    const res = await fetch(url);
    return res.json();
  },
  5 // concurrency limit
);
A pool of worker functions pulls from a shared queue. This caps parallelism to avoid overwhelming servers while staying faster than sequential execution.
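Token Bucket Rate Limiter
The key points mention rate limiting, but none of the examples above implement it. Here is a minimal token-bucket sketch; the `TokenBucket` class and its `take` API are illustrative names, not from any particular library:

```javascript
// Minimal token-bucket sketch: bursts up to `capacity` pass immediately,
// sustained traffic is smoothed to `refillPerSecond`.
class TokenBucket {
  constructor(capacity, refillPerSecond) {
    this.capacity = capacity;
    this.tokens = capacity; // start full
    this.refillPerSecond = refillPerSecond;
    this.lastRefill = Date.now();
  }

  refill() {
    const now = Date.now();
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + elapsedSec * this.refillPerSecond
    );
    this.lastRefill = now;
  }

  // Resolves once a token is available; callers effectively queue by polling.
  async take() {
    while (true) {
      this.refill();
      if (this.tokens >= 1) {
        this.tokens -= 1;
        return;
      }
      // Sleep roughly long enough for one token to accumulate.
      await new Promise(r => setTimeout(r, 1000 / this.refillPerSecond));
    }
  }
}
```

Each caller awaits `take()` before issuing its request, for example `await bucket.take(); await fetch(url);`. This keeps quota enforcement in one place instead of scattering delays through the pipeline.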
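Async Semaphore and Mutex
The mutex/semaphore pattern from the key points can be sketched in a few lines. The `Semaphore` class and `withLock` helper below are illustrative, assumed names, not a standard API:

```javascript
// Minimal counting semaphore: at most `max` holders at once.
class Semaphore {
  constructor(max) {
    this.available = max;
    this.waiters = []; // resolve callbacks for queued acquirers
  }

  async acquire() {
    if (this.available > 0) {
      this.available--;
      return;
    }
    // No permit free: park until a release hands us one.
    await new Promise(resolve => this.waiters.push(resolve));
  }

  release() {
    const next = this.waiters.shift();
    if (next) next(); // transfer the permit directly to the next waiter
    else this.available++;
  }
}

// A mutex is just a semaphore with max = 1.
const mutex = new Semaphore(1);

async function withLock(fn) {
  await mutex.acquire();
  try {
    return await fn();
  } finally {
    mutex.release(); // always release, even if fn throws
  }
}
```

Because a read-modify-write that spans an await can interleave with other tasks, wrapping it in `withLock` serializes the critical section without blocking the event loop.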
Common Mistakes
- Firing hundreds of parallel requests without concurrency limits, crashing the server or hitting rate limits
- Not passing AbortController signals through async pipelines, leaving dangling requests on cancellation
- Retrying non-idempotent operations (POST/DELETE) without understanding side-effect safety
- Leaking memory by holding references to resolved Promises in long-running loops
- Using fixed delay instead of exponential backoff, causing thundering-herd retries
Interview Tips
- Explain the difference between unbounded Promise.all and a concurrency-limited pool
- Walk through exponential backoff math: delay = min(baseDelay * 2^(attempt - 1) + jitter, maxDelay)
- Know when async generators are better than collecting everything into an array
- Mention real-world use cases: API rate limits, database connection pools, file processing pipelines