A single misplaced await slowed down an entire .NET app in production. Learn how async/await can silently throttle performance, why await inside loops causes sequential execution, and how to fix it with Task.WhenAll and throttling. A real-world lesson every C# developer should know.
We recently encountered a performance mystery in one of our .NET 7 APIs. Everything seemed fine in development and QA, but once production traffic hit a threshold, response times exploded while the CPU was hardly taxed. It wasn't memory, GC, scaling, or infrastructure. It was one tiny misplaced await.
Here’s how a simple misstep turned into a production disaster — and how we fixed it.
Our API passed all tests, handled the expected load on staging, and logs showed nothing alarming. Then we launched, and things got weird.
We tried everything: deep profiling, GC tuning, scaling out. Nada.
During one late-night review, someone asked: “Why are we awaiting inside the loop?” That question turned everything around.
The culprit: await inside a loop

Here's a simplified version of what we had:
public async Task ProcessItemsAsync(List<Item> items)
{
    foreach (var item in items)
    {
        await _externalService.SendAsync(item);
    }
}
At first glance, that looks okay — you call asynchronous work per item. But the problem is subtle:
By using await inside the loop, we force sequential execution: "Wait for this one to finish, then move to the next."
Under light load, the delay might be negligible. Under heavy load, or when each item triggers multiple sub-operations, the penalty compounds quickly.
In our case, some items triggered 50+ internal operations. That meant each iteration waited for long chains of calls to complete before proceeding to the next. That serialized work severely throttled throughput.
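The compounding is easy to demonstrate in isolation. Here's a minimal, self-contained sketch; Task.Delay stands in for the external call, and the 100 ms latency is a hypothetical assumption, not a measurement from our system:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

class SequentialDemo
{
    // Hypothetical stand-in for _externalService.SendAsync: a 100 ms I/O call.
    static Task SendAsync(int item) => Task.Delay(100);

    static async Task Main()
    {
        var items = Enumerable.Range(1, 10).ToList();
        var sw = Stopwatch.StartNew();

        foreach (var item in items)
        {
            await SendAsync(item); // each call waits for the previous one to finish
        }

        sw.Stop();
        // Roughly 10 × 100 ms ≈ 1 second of pure waiting, even though
        // none of the calls depend on each other.
        Console.WriteLine($"Sequential: {sw.ElapsedMilliseconds} ms");
    }
}
```

Scale that shape up to hundreds of items with dozens of sub-operations each and the latency budget disappears entirely into waiting.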
We changed the method to:
public async Task ProcessItemsAsync(List<Item> items)
{
    var tasks = items
        .Select(item => _externalService.SendAsync(item));
    await Task.WhenAll(tasks);
}
With this approach, all the tasks are started and then awaited together with Task.WhenAll. That means all items "work in parallel" (or as parallel as the thread pool and service constraints allow) instead of lining up one after another.
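As a sketch of the difference (again with Task.Delay as a hypothetical stand-in for the external call), ten concurrent 100 ms calls complete in roughly the time of one:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

class ConcurrentDemo
{
    // Hypothetical stand-in for _externalService.SendAsync.
    static Task SendAsync(int item) => Task.Delay(100);

    static async Task Main()
    {
        var items = Enumerable.Range(1, 10);
        var sw = Stopwatch.StartNew();

        // All ten tasks start up front, then we await them together.
        await Task.WhenAll(items.Select(SendAsync));

        sw.Stop();
        // ~100 ms total, versus ~1,000 ms for an awaited loop.
        Console.WriteLine($"WhenAll: {sw.ElapsedMilliseconds} ms");
    }
}
```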
Once deployed, we saw massive improvements. Throughput climbed, response times steadied, and the bottleneck vanished.
However, a few lessons emerged. Launching every task at once can overwhelm a downstream service, so use a throttling mechanism. A pattern we adopted:
var semaphore = new SemaphoreSlim(5);

var tasks = items.Select(async item =>
{
    await semaphore.WaitAsync();
    try
    {
        await _externalService.SendAsync(item);
    }
    finally
    {
        semaphore.Release();
    }
});

await Task.WhenAll(tasks);
This caps the number of simultaneous calls (five in this example) while still avoiding full serialization.
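On .NET 6 and later (our API was on .NET 7), Parallel.ForEachAsync offers a built-in way to get the same bounded concurrency without managing a semaphore yourself. A sketch, reusing the same items and _externalService from the examples above:

```csharp
// MaxDegreeOfParallelism plays the role of the SemaphoreSlim count:
// at most 5 calls are in flight at once.
var options = new ParallelOptions { MaxDegreeOfParallelism = 5 };

await Parallel.ForEachAsync(items, options, async (item, cancellationToken) =>
{
    await _externalService.SendAsync(item);
});
```

One caveat for either shape: concurrent awaiting surfaces failures differently from a sequential loop (Task.WhenAll gathers exceptions from all faulted tasks), so decide deliberately how a failed item should be handled.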
Before each await, pause: "Do I need to wait here before moving on?" If not, consider concurrent alternatives.

Async/await is a powerful tool, but misused it's a trap. That one misplaced await was silently throttling our entire system. Once we rewrote the logic to start the tasks and await them together, performance returned.
Whenever you reach for await, don't assume it's benign. Think: is a sequential wait what I really want? If not, structure for concurrency (with limits as needed). Your future self (and your ops team) will thank you.