Task.WhenAll with Select is a footgun - but why?

N+1 Problem

Putting threads, tasks, async and parallelism to one side, what you describe is an N+1 problem, which is worth avoiding for exactly the reason you've run into: it's all well and good when N (your user count) is small, but it grinds to a halt as the user count grows.

You may want to find a different solution. Do you have to do this operation for all users? If so, then maybe switch to a background process and fan out the work per user.
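
For example (a minimal sketch only, assuming the same userIds and GetUserDetailsAsync used in the snippets below): enqueue one work item per user onto a queue, such as a System.Threading.Channels channel, and let a background worker (e.g. a hosted BackgroundService) drain it at its own pace, off the request path.

using System.Threading.Channels;
using System.Threading.Tasks;

// A queue of user ids to process later, instead of fetching everything inline.
var queue = Channel.CreateUnbounded<string>();

// Producer (e.g. the request handler) just enqueues the ids and returns.
foreach (var userId in userIds)
    await queue.Writer.WriteAsync(userId);
queue.Writer.Complete();

// Consumer (e.g. a hosted BackgroundService) works through them at its own pace.
await foreach (var userId in queue.Reader.ReadAllAsync())
{
    var user = await GetUserDetailsAsync(userId); // same call as in the code below
    // ... persist or publish the result
}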

Back to the footgun (I had to look that up, BTW).

Tasks are promises, similar to JavaScript's. In .NET they may complete on a separate thread - usually a thread from the thread pool.

In .NET Core, they usually do complete on a separate thread if they are not already complete at the point of awaiting; for an HTTP request, that is almost certain to be the case.
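
A quick console sketch to illustrate (example.com is just a placeholder URL): in a console app there is no synchronization context, so the code after the await typically resumes on a thread pool thread.

using System;
using System.Net.Http;

Console.WriteLine($"Before await: thread {Environment.CurrentManagedThreadId}");

using var client = new HttpClient();
// The request is genuinely asynchronous, so the continuation below
// resumes on a thread pool thread once the response arrives.
await client.GetStringAsync("https://example.com");

Console.WriteLine($"After await: thread {Environment.CurrentManagedThreadId}");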

You may have exhausted the thread pool, but since you're making HTTP requests, I suspect you've exhausted the number of concurrent outbound HTTP requests instead. "The default connection limit is 10 for ASP.NET hosted applications and 2 for all others." See the documentation here.
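
If you do need to change that limit, it is configurable. A sketch, assuming you control how the HttpClient is constructed (the value 50 and the cap of 10 are just example numbers; the ServicePointManager limit quoted above applies to the older .NET Framework/HttpWebRequest stack, while on .NET Core and later HttpClient sits on SocketsHttpHandler, whose per-server limit is effectively unbounded by default):

using System.Net;
using System.Net.Http;

// .NET Framework (HttpWebRequest-based stack): raise the limit process-wide.
ServicePointManager.DefaultConnectionLimit = 50;

// .NET Core / .NET 5+: cap (or raise) the per-server limit on the handler.
var handler = new SocketsHttpHandler { MaxConnectionsPerServer = 10 };
var client = new HttpClient(handler);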

Is there a way to achieve some parallelism and not exhaust a resource (threads or HTTP connections)? - Yes.

Here's a pattern I often implement for just this reason, using Batch() from morelinq.

// Resolve users in batches of 10, so only 10 requests are in flight at a time.
IEnumerable<User> users = Enumerable.Empty<User>();
IEnumerable<IEnumerable<string>> batches = userIds.Batch(10);
foreach (IEnumerable<string> batch in batches)
{
    // Start the requests for this batch...
    IEnumerable<Task<User>> batchTasks = batch.Select(userId => GetUserDetailsAsync(userId));
    // ...and wait for all of them before moving on to the next batch.
    User[] batchUsers = await Task.WhenAll(batchTasks);
    users = users.Concat(batchUsers);
}

You still get ten asynchronous HTTP requests to GetUserDetailsAsync() at a time, and you don't exhaust threads or concurrent HTTP connections (or at worst you max out at the 10).

Now if this is a heavily used operation, or the service behind GetUserDetailsAsync() is heavily used elsewhere in the app, you may still hit the same limits when your system is under load, so this batching is not always a good idea. YMMV.


You already have an excellent answer here, but just to chime in:

There's no problem with creating thousands of tasks. They're not threads.
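
A quick illustration of that point (ThreadPool.ThreadCount needs .NET Core 3.0 or later): ten thousand pending tasks use only a handful of threads, because nothing is blocked while they wait.

using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// 10,000 tasks, but nowhere near 10,000 threads: Task.Delay is timer-based,
// so no thread is tied up while the tasks are pending.
var tasks = Enumerable.Range(0, 10_000).Select(_ => Task.Delay(1000));
await Task.WhenAll(tasks);

// Typically prints a small number (roughly the CPU count), not thousands.
Console.WriteLine($"Thread pool threads in use: {ThreadPool.ThreadCount}");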

The core problem is that you're hitting the API way too much. So the best solutions are going to change how you call that API:

  1. Do you really need user details for thousands of users, all at once? If this is for a dashboard display, then change your API to enforce paging; if this is for a batch process, then see if you can access the data directly from the batch process.
  2. Use a batch route for that API if it supports one.
  3. Use caching if possible (see the sketch after this list).
  4. Finally, if none of the above are possible, look into throttling the API calls.
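
To illustrate point 3, here's a minimal caching sketch in front of the same GetUserDetailsAsync call, using IMemoryCache from Microsoft.Extensions.Caching.Memory. The GetUserDetailsCachedAsync wrapper and the five-minute expiry are just examples; whether user details are safe to cache, and for how long, depends on your data.

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());

// Returns the cached user if present; otherwise calls the API once and caches it.
async Task<User> GetUserDetailsCachedAsync(string userId)
{
    var user = await cache.GetOrCreateAsync(userId, entry =>
    {
        // Tune the expiry for how stale user details are allowed to be.
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
        return GetUserDetailsAsync(userId);
    });
    return user!; // the factory always produces a value
}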

The standard pattern for asynchronous throttling is to use SemaphoreSlim, which looks like this:

using var throttler = new SemaphoreSlim(10);
var userTasks = userIds.Select(async userId =>
{
  // Each task waits for a free slot, so at most 10 calls run at once.
  await throttler.WaitAsync();
  try { return await GetUserDetailsAsync(userId); }
  finally { throttler.Release(); }
});
var users = await Task.WhenAll(userTasks); // users is User[]

Again, this kind of throttling is best only if you can't make the design changes to avoid thousands of API calls in the first place.