Proper use of mutexes in Python

I would like to improve on chris-b's answer a little bit more.

See below for my code:

from threading import Thread, Lock
import threading

mutex = Lock()


def processData(data, thread_safe):
    if thread_safe:
        mutex.acquire()
    try:
        # Identify which thread is doing the printing.
        thread_id = threading.get_ident()
        print('\nProcessing data:', data, "ThreadId:", thread_id)
    finally:
        if thread_safe:
            mutex.release()


counter = 0
max_run = 100
thread_safe = False        # flip to True to serialize the printing
while True:
    some_data = counter
    t = Thread(target=processData, args=(some_data, thread_safe))
    t.start()
    counter = counter + 1
    if counter >= max_run:
        break

On your first run, if you set thread_safe = False before the while loop, the mutex is not used and the threads step on each other's output in the print call, as shown below:

Not Thread safe

But if you set thread_safe = True and run it again, all the output comes out cleanly:

Thread safe

Hope this helps.
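
As a side note, since Lock is a context manager, the acquire/try/finally/release pattern in processData can be written with a with statement. Here is a minimal sketch of the same function, assuming the same module-level mutex as above:

from threading import Lock
import threading

mutex = Lock()

def processData(data, thread_safe):
    if thread_safe:
        # The with statement acquires the lock and releases it even if
        # the print below raises an exception.
        with mutex:
            print('\nProcessing data:', data, "ThreadId:", threading.get_ident())
    else:
        print('\nProcessing data:', data, "ThreadId:", threading.get_ident())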


This is the solution I came up with:

import time
from threading import Thread, Lock


def myfunc(i, mutex):
    # Only one thread at a time gets past this acquire().
    mutex.acquire()
    time.sleep(1)
    print("Thread: %d" % i)
    mutex.release()


mutex = Lock()
for i in range(10):
    t = Thread(target=myfunc, args=(i, mutex))
    t.start()
    print("main loop %d" % i)

Output:

main loop 0
main loop 1
main loop 2
main loop 3
main loop 4
main loop 5
main loop 6
main loop 7
main loop 8
main loop 9
Thread: 0
Thread: 1
Thread: 2
Thread: 3
Thread: 4
Thread: 5
Thread: 6
Thread: 7
Thread: 8
Thread: 9
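
If you want the main loop to wait until every worker has finished before continuing, you can keep the Thread objects around and join them. A small sketch built on the same myfunc and mutex as above:

threads = []
for i in range(10):
    t = Thread(target=myfunc, args=(i, mutex))
    t.start()
    threads.append(t)
    print("main loop %d" % i)

# Block until every worker has printed and released the lock.
for t in threads:
    t.join()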

I don't know why you're using the Windows Mutex instead of Python's. Using the Python methods, this is pretty simple:

from threading import Thread, Lock

mutex = Lock()

def processData(data):
    mutex.acquire()
    try:
        print('Do some stuff')
    finally:
        # Always release, even if the work above raises.
        mutex.release()

while True:
    # some_data comes from whatever your surrounding code produces.
    t = Thread(target=processData, args=(some_data,))
    t.start()

But note that, because of the architecture of CPython (namely the Global Interpreter Lock), only one thread actually executes Python code at a time anyway. This is fine if a number of them are I/O bound, although you'll want to hold your lock for as short a time as possible, so that a thread stuck waiting on I/O doesn't keep the others from running.
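
For example, if each thread does some blocking I/O, keep the lock scoped to just the shared-state update rather than to the whole job. A rough sketch, assuming the work is a simple HTTP fetch (the worker, url, and results names here are just illustrative):

from threading import Thread, Lock
import urllib.request

mutex = Lock()
results = []

def worker(url):
    # The blocking network call happens outside the lock, so other
    # threads can run while this one waits on I/O.
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    # Hold the lock only for the quick shared-state update.
    with mutex:
        results.append((url, len(body)))

if __name__ == '__main__':
    t = Thread(target=worker, args=('https://example.com/',))
    t.start()
    t.join()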

An alternative, for Python 2.6 and later, is to use Python's multiprocessing package. It mirrors the threading package, but will create entirely new processes which can run simultaneously. It's trivial to update your example:

from multiprocessing import Process, Lock

def processData(data, mutex):
    # The lock is passed in so every process uses the same underlying
    # lock, even when the module is re-imported by the child.
    with mutex:
        print('Do some stuff')

if __name__ == '__main__':
    mutex = Lock()
    while True:
        # some_data comes from whatever your surrounding code produces.
        p = Process(target=processData, args=(some_data, mutex))
        p.start()
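
Note that the lock is passed to the child process explicitly rather than left as a module-level global: on platforms that use the spawn start method (Windows, and macOS on recent Python versions), the child re-imports the module, so a global Lock() created at import time would be a different object in each process. The if __name__ == '__main__' guard is needed for the same reason.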