Python: wait for processes in a multiprocessing Pool to complete without closing the Pool or using map()

I have a piece of code like the one below:

pool = multiprocessing.Pool(10)
for i in range(300):
    for m in range(500):
        data = do_some_calculation(resource)
        pool.apply_async(paralized_func, data, callback=update_resource)
    # need to wait for all tasks to finish here
    # {...}
    # Summarize resource
    do_something_with_resource(resource)

So basically I have two loops. I initialize the process pool once, outside the loops, to avoid the overhead of re-creating it. At the end of the inner loop, I want to summarize the results of all the tasks.

The problem is that I can't use pool.map() to wait, because the input data varies from task to task. I can't use pool.close() and pool.join() either, because I still need the pool in the next iteration of the outer loop.

What is a good way to wait for all the tasks to finish in this case?

I tried polling pool._cache at the end of the inner loop:

while len(pool._cache) > 0:  # _cache holds the AsyncResults that haven't finished yet
    sleep(0.001)

This works, but it feels hacky since it relies on a private attribute. Is there a better way to do this?
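Another variant I can think of is to count outstanding tasks myself and have the callback decrement the counter (callbacks all run on the pool's single result-handler thread). A rough, untested sketch, where paralized_func and update_resource have placeholder bodies and on_done/on_error are wrappers I'd add around update_resource:

import multiprocessing
import threading

def paralized_func(x):
    return x * x  # placeholder for the real worker

def update_resource(res):
    pass  # placeholder for the real callback

pending = 0
cond = threading.Condition()

def on_done(res):
    # callbacks all run on the pool's single result-handler thread
    global pending
    update_resource(res)
    with cond:
        pending -= 1
        cond.notify()

def on_error(exc):
    # fires instead of on_done when the worker raises
    global pending
    with cond:
        pending -= 1
        cond.notify()

if __name__ == '__main__':
    pool = multiprocessing.Pool(10)
    for m in range(500):
        with cond:
            pending += 1
        pool.apply_async(paralized_func, (m,), callback=on_done,
                         error_callback=on_error)
    with cond:
        cond.wait_for(lambda: pending == 0)  # block until every task reported back
    pool.close()
    pool.join()

But that is a lot of bookkeeping for something this common, hence the question.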


apply_async returns an AsyncResult object, which has a wait([timeout]) method you can use.

Example:

pool = multiprocessing.Pool(10)
for i in range(300):
    results = []
    for m in range(500):
        data = do_some_calculation(resource)
        result = pool.apply_async(paralized_func, data, callback=update_resource)
        results.append(result)
    for result in results:
        result.wait()  # block until this task has finished
    # all tasks have finished at this point
    # Summarize resource
    do_something_with_resource(resource)

I haven't run this code since it isn't executable as-is, but it should work.
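One caveat: wait() only blocks until the task completes; if the worker raised an exception, wait() still returns silently, and the error only surfaces when you call get(). If you want worker errors to propagate to the parent process, use get() in the loop instead:

for result in results:
    result.get()  # blocks like wait(), but re-raises any exception from the worker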