Why forEach Ignores Your await (And What to Use Instead)
I once spent an embarrassing amount of time debugging a MERN project where data kept arriving out of order and console.log statements were firing before the awaited calls they depended on. The code looked correct. I'd read it line by line. I'd checked the backend. Everything looked fine.
The culprit was a single forEach.
What the broken code looked like
arr.forEach(async (item) => {
  const res = await someAPICall(item)
  processResult(res)
})
console.log('done') // fires before any of the above finishes
Visually, this reads like "iterate over each item, wait for the API call, then process the result." That's not what happens.
Why it breaks
JavaScript is single-threaded. forEach is not creating threads or running callbacks in parallel — that's not the issue. The issue is simpler and more fundamental.
forEach was designed before async/await existed. Its contract is: call each callback, ignore the return value. When you pass an async function as the callback, that function returns a Promise — and forEach simply discards it without waiting on it.
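You can see that contract as a plain loop. Here myForEach is a hypothetical, stripped-down stand-in (the real spec also handles sparse arrays and a thisArg), but it captures the part that matters:

```javascript
// Hypothetical simplified sketch of forEach's contract:
// call each callback, ignore whatever it returns.
const myForEach = (arr, callback) => {
  for (let i = 0; i < arr.length; i++) {
    // An async callback returns a Promise here, and it is simply discarded
    callback(arr[i], i, arr)
  }
}

const log = []
myForEach([1, 2], async (n) => {
  log.push(`start ${n}`)
  await Promise.resolve()
  log.push(`end ${n}`)
})
log.push('myForEach returned')

console.log(log) // ['start 1', 'start 2', 'myForEach returned']; the 'end' entries land later
```

Each callback runs synchronously up to its first await, returns a pending Promise, and the loop barrels on.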
So what actually happens is:
- forEach calls the first callback, which starts an async operation and returns a Promise
- forEach throws that Promise away and immediately calls the second callback
- Same for the third, fourth, and so on
- forEach returns, and the loop is considered "done" — even though none of the async work has completed
You can see this clearly with a minimal example:
const arr = [1, 2, 3]
arr.forEach(async (n) => {
  await new Promise(resolve => setTimeout(resolve, 1000))
  console.log(n)
})
console.log('forEach returned')
Output:
forEach returned ← immediate
1 ← 1 second later
2
3
The loop returned before a single iteration finished. Any code after the forEach runs against unresolved state.
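The most common real-world symptom is collecting results inside the loop and reading them right after. A minimal sketch of that trap:

```javascript
// The classic symptom: an array that is mysteriously empty afterwards
const results = []
const items = [1, 2, 3]

items.forEach(async (n) => {
  await Promise.resolve() // any await defers the rest of the callback
  results.push(n * 2)
})

console.log(results.length) // 0: none of the pushes have happened yet
```

The pushes do eventually run, but only after the surrounding synchronous code has already read an empty array.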
Fix 1 — Sequential with for...of
If order matters and each iteration should wait for the previous one to finish before starting:
for (const item of arr) {
  const res = await someAPICall(item)
  processResult(res)
}
This is the direct drop-in replacement. Each await pauses the loop until the promise resolves, then moves on to the next item. (Note that await itself must sit inside an async function, or at the top level of an ES module.) The tradeoff: if you have 10 items and each call takes 500ms, you're waiting 5 seconds total.
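You can confirm the sequential timing with a stubbed call. fakeAPICall here is a made-up stand-in that resolves after 100ms:

```javascript
// fakeAPICall is a made-up stand-in for a real request; resolves after 100ms
const fakeAPICall = (n) =>
  new Promise((resolve) => setTimeout(() => resolve(n * 2), 100))

const runSequential = async (items) => {
  const start = Date.now()
  const results = []
  for (const n of items) {
    results.push(await fakeAPICall(n)) // pauses here until this call resolves
  }
  return { results, elapsed: Date.now() - start }
}

runSequential([1, 2, 3]).then(({ results, elapsed }) => {
  console.log(results)        // [2, 4, 6], in input order
  console.log(`${elapsed}ms`) // roughly 300ms: three 100ms calls, back to back
})
```

Three 100ms calls take about 300ms because each iteration only starts once the previous one finishes.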
Fix 2 — Parallel with Promise.all
If order doesn't matter and you want all calls to run at the same time:
await Promise.all(arr.map(async (item) => {
  const res = await someAPICall(item)
  processResult(res)
}))
map collects all the returned Promises into an array, and Promise.all waits until every one of them has resolved. With the same 10 items at 500ms each, you're now waiting ~500ms total.
This is the right pattern for independent operations where you want maximum throughput — fetching multiple records, running parallel enrichment steps, firing multiple writes that don't depend on each other. One caveat: Promise.all rejects as soon as any single promise rejects, so if you need every result even when some calls fail, use Promise.allSettled instead.
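The same stubbed-call setup from before shows the difference (fakeAPICall is again a made-up 100ms stand-in). A useful detail: Promise.all returns results in input order, regardless of which call finishes first:

```javascript
// fakeAPICall is a made-up stand-in for a real request; resolves after 100ms
const fakeAPICall = (n) =>
  new Promise((resolve) => setTimeout(() => resolve(n * 2), 100))

const runParallel = async (items) => {
  const start = Date.now()
  // All calls start immediately; Promise.all waits for the slowest one
  const results = await Promise.all(items.map((n) => fakeAPICall(n)))
  return { results, elapsed: Date.now() - start }
}

runParallel([1, 2, 3]).then(({ results, elapsed }) => {
  console.log(results)        // [2, 4, 6]; order matches the input, not completion
  console.log(`${elapsed}ms`) // roughly 100ms: all three calls overlap
})
```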
Fix 3 — Parallel with a concurrency limit
Promise.all runs everything at once, which can overwhelm a rate-limited API or flood a database with connections. If you need concurrency but with a cap, reach for a library like p-limit:
import pLimit from 'p-limit'

const limit = pLimit(3) // max 3 concurrent

await Promise.all(
  arr.map((item) =>
    limit(async () => {
      const res = await someAPICall(item)
      processResult(res)
    })
  )
)
This processes items in parallel but never more than 3 at a time. Useful when hitting external APIs with rate limits, or when you're doing heavy I/O and don't want to saturate the connection pool.
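If you'd rather not add a dependency, the same idea can be hand-rolled as a small worker pool. mapWithLimit below is a sketch under that assumption, not a drop-in for everything p-limit does:

```javascript
// Minimal hand-rolled concurrency limiter (sketch): run `fn` over `items`
// with at most `limit` calls in flight, preserving input order in the results.
const mapWithLimit = async (items, limit, fn) => {
  const results = new Array(items.length)
  let next = 0 // shared cursor; safe because JS callbacks never run concurrently

  const worker = async () => {
    while (next < items.length) {
      const i = next++ // claim an index, then await its work
      results[i] = await fn(items[i])
    }
  }

  // Spawn up to `limit` workers that drain the shared queue
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => worker()
  )
  await Promise.all(workers)
  return results
}
```

Usage mirrors the p-limit version: await mapWithLimit(arr, 3, someAPICall) keeps at most 3 calls in flight and returns results in input order.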
Choosing the right fix
| Scenario | Pattern |
|---|---|
| Order matters, each step depends on the previous | for...of |
| Independent operations, want max speed | Promise.all |
| Independent operations, API/DB has rate limits | Promise.all + p-limit |
The key rule: if you're using await inside a loop, forEach is almost never what you want. for...of is safe by default; reach for Promise.all when you need the speed.