multiprocessing returns "too many open files" but using `with...as` fixes it. Why?

You're creating a new pool of processes on every iteration of your loop and never closing it once you're done with it. Each pool opens pipes (file descriptors) to its worker processes, so the unclosed pools accumulate until you hit the operating system's open-file limit and get "too many open files".
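For illustration, the leaking pattern looks roughly like this (a minimal sketch; cycle stands in for your worker function, and the hard-coded list stands in for your process_per_cycle):

import multiprocessing

def cycle(offset):  # stand-in for the question's worker function
    return offset * 2

if __name__ == '__main__':
    for nprocess in [4, 4, 4]:  # stands in for process_per_cycle
        pool = multiprocessing.Pool(nprocess)  # fresh pool, fresh pipes, every iteration
        pool.map(cycle, range(10))  # the pool is never closed, so its pipes leak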

You could fix this by using the pool as a context manager, which automatically calls pool.terminate() when the block exits, or by calling pool.terminate() yourself. Alternatively, why don't you create the pool outside the loop just once, and then send tasks to its processes inside?

import multiprocessing

pool = multiprocessing.Pool(nprocess)  # initialise your pool once, outside the loop
for _ in process_per_cycle:  # loop variable renamed so it no longer shadows nprocess
    ...
    pool.map(cycle, offsets)  # delegate work inside your loop

pool.close()  # no more work will be submitted to the pool
pool.join()  # wait for the workers to finish before moving on
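If you want to keep the with...as form from your question, the same single-pool pattern can be written like this (a sketch using the same names as above; a Pool's __exit__ calls terminate() for you when the block ends):

with multiprocessing.Pool(nprocess) as pool:  # terminate() is called automatically on exit
    for _ in process_per_cycle:
        ...
        pool.map(cycle, offsets)  # the one pool is reused on every cycle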

For more information, you could peruse the multiprocessing.Pool documentation.


It is a context manager. Using with ensures that resources are acquired and released properly: when the block exits, the pool's __exit__ method is called, which terminates the workers and closes their pipes (the open files from your error). To understand this in detail, I'd recommend this article: https://jeffknupp.com/blog/2016/03/07/python-with-context-managers/
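Concretely, for a Pool the with statement is roughly equivalent to a try/finally that calls terminate() (a minimal runnable sketch; square is a hypothetical worker function):

import multiprocessing

def square(x):
    return x * x

if __name__ == '__main__':
    # with-form: the pool is always cleaned up, even if map() raises
    with multiprocessing.Pool(4) as pool:
        print(pool.map(square, range(8)))

    # roughly what __enter__/__exit__ do behind the scenes
    pool = multiprocessing.Pool(4)
    try:
        print(pool.map(square, range(8)))
    finally:
        pool.terminate()  # workers and their pipes are released here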