Python multiprocessing - check status of each process

You can check whether a process is alive after you have tried to join it. Don't forget to set a timeout, otherwise join() will block until the job is finished.

Here is a simple example:

from multiprocessing import Process
import time

def task():
    time.sleep(5)  # simulate a long-running job

procs = []

for x in range(2):
    proc = Process(target=task)
    procs.append(proc)
    proc.start()

time.sleep(2)

for proc in procs:
    proc.join(timeout=0)  # timeout=0: return immediately instead of blocking
    if proc.is_alive():
        print("Job is not finished!")

I found this solution some time ago (somewhere here on Stack Overflow) and I am very happy with it.

Basically, it uses the signal module to raise an exception if a process takes longer than expected.

All you need to do is to add this class to your code:

import signal

class Timeout:
    """Context manager that raises TimeoutError if its body runs too long.

    Note: this relies on SIGALRM, so it only works on Unix, and only
    in the main thread of a process.
    """

    def __init__(self, seconds=1, error_message='TimeoutError'):
        self.seconds = seconds
        self.error_message = error_message

    def handle_timeout(self, signum, frame):
        raise TimeoutError(self.error_message)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self.handle_timeout)
        signal.alarm(self.seconds)  # schedule SIGALRM in `seconds` seconds

    def __exit__(self, exc_type, exc_value, traceback):
        signal.alarm(0)  # cancel the pending alarm

Here is a general example of how it works:

import time

with Timeout(seconds=3, error_message='JobX took too much time'):
    try:
        time.sleep(10)  # your job
    except TimeoutError as e:
        print(e)

In your case, I would wrap the job that your worker needs to perform in the with statement. Then you catch the exception and handle it however you think is best.
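For instance, a worker function could look something like this (a minimal sketch; do_work is a hypothetical placeholder for your actual job):

def worker():
    with Timeout(seconds=3, error_message='Worker took too much time'):
        try:
            do_work()  # hypothetical placeholder for the real job
        except TimeoutError as e:
            print(e)
            # clean up, log, or retry here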

Alternatively, you can periodically check whether the processes are still alive, with an overall deadline:

timeout = 3  # seconds
start = time.time()
while time.time() - start < timeout:
    if any(proc.is_alive() for proc in processes):
        time.sleep(1)  # still working; poll again shortly
    else:
        print('All processes done')
        break
else:  # only reached when the deadline passed without a break
    print("Timeout!")
    # do something
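In the timeout branch, one option is to forcibly stop whatever is still running; terminate() and join() are standard Process methods, so a sketch could be:

for proc in processes:
    if proc.is_alive():
        proc.terminate()  # sends SIGTERM on Unix
        proc.join()       # reap the process so it is not left as a zombie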