Exception handling in concurrent.futures.Executor.map

The map method returns a generator, which lets you iterate through the results as they become ready.

Unfortunately, it is not possible to resume a generator after an exception occurs. From PEP 255:

If an unhandled exception-- including, but not limited to, StopIteration --is raised by, or passes through, a generator function, then the exception is passed on to the caller in the usual way, and subsequent attempts to resume the generator function raise StopIteration. In other words, an unhandled exception terminates a generator's useful life.
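In practice this means that iterating over executor.map's results stops at the first failing item and the remaining results are lost. A minimal sketch (might_fail is a made-up worker used only for illustration):

import concurrent.futures

def might_fail(n):
    # Made-up worker that fails for one input.
    if n == 2:
        raise ValueError("boom on %d" % n)
    return n * n

with concurrent.futures.ThreadPoolExecutor() as executor:
    results = executor.map(might_fail, range(5))
    try:
        for value in results:
            print(value)
    except ValueError as exc:
        print("iteration stopped at the first error:", exc)
    # The generator is now exhausted; results for the remaining
    # inputs can no longer be retrieved from it.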

There are other libraries, such as pebble, which let you continue iterating after an error occurs. Check the examples in its documentation; a sketch of the pattern is shown below.
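The following is a rough sketch based on the example in pebble's documentation, not a definitive recipe; might_fail is the same made-up worker as above, and the exact API details may vary between pebble versions:

from concurrent.futures import TimeoutError
from pebble import ProcessPool

def might_fail(n):
    # Made-up worker that fails for one input.
    if n == 2:
        raise ValueError("boom on %d" % n)
    return n * n

if __name__ == '__main__':
    with ProcessPool() as pool:
        future = pool.map(might_fail, range(5), timeout=10)
        iterator = future.result()
        while True:
            try:
                print(next(iterator))
            except StopIteration:
                break                                    # all items consumed
            except TimeoutError as error:
                print("call took longer than %d seconds" % error.args[1])
            except Exception as error:
                print("call failed: %s" % error)         # keep iterating past the failure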


Ehsan's solution is good, but it may be slightly more efficient to take the results as they are completed instead of waiting for sequential items in the list to finish. Here is an example from the library's documentation.

import concurrent.futures
import urllib.request

URLS = ['http://www.foxnews.com/',
        'http://www.cnn.com/',
        'http://europe.wsj.com/',
        'http://www.bbc.co.uk/',
        'http://some-made-up-domain.com/']

# Retrieve a single page and report the URL and contents
def load_url(url, timeout):
    with urllib.request.urlopen(url, timeout=timeout) as conn:
        return conn.read()

# We can use a with statement to ensure threads are cleaned up promptly
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    # Start the load operations and mark each future with its URL
    future_to_url = {executor.submit(load_url, url, 60): url for url in URLS}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
        try:
            data = future.result()
        except Exception as exc:
            print('%r generated an exception: %s' % (url, exc))
        else:
            print('%r page is %d bytes' % (url, len(data)))

As mentioned above, executor.map's API is unfortunately limited: it only lets you get the first exception, and when iterating through the results you will only get values up to that first exception.

To answer your question, if you don't want to use a different library, you can unroll your map and submit each call manually:

import concurrent.futures

# Submit each call individually and keep a reference to every future.
future_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
    for arg in range(10):
        # test_func is the function from your question.
        future = executor.submit(test_func, arg)
        future_list.append(future)

# result() re-raises the exception from that particular call, so one
# failure does not prevent you from reading the others.
for future in future_list:
    try:
        print(future.result())
    except Exception as e:
        print(e)

This allows you to handle each future individually.
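If you don't need the results in submission order, the same list of futures can be handed to concurrent.futures.as_completed, as in the documentation example above. A minimal variation (test_func is still the function from your question):

import concurrent.futures

future_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
    for arg in range(10):
        future_list.append(executor.submit(test_func, arg))

    # Handle futures as they finish instead of in submission order.
    for future in concurrent.futures.as_completed(future_list):
        try:
            print(future.result())
        except Exception as e:
            print(e)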