Multiprocessing Python within a frozen script

This is not an issue of the multiprocessing library or py2exe per se but a side effect of the way you run the application. The py2exe documentation contains some discussion on this topic:

A program running under Windows can be of two types: a console program or a windows program. A console program is one that runs in the command prompt window (cmd). Console programs interact with users using three standard channels: standard input, standard output and standard error […].

As opposed to a console application, a windows application interacts with the user using a complex event-driven user interface and therefore has no need for the standard channels whose use in such applications usually results in a crash.

Py2exe works around these issues automatically in some cases, but at least one of your processes has no attached standard output (sys.stdout is None), which means that sys.stdout.flush() amounts to None.flush(), which yields the error you are getting. The documentation linked above offers an easy fix that redirects all output to files:

import sys
sys.stdout = open("my_stdout.log", "w")
sys.stderr = open("my_stderr.log", "w")

Simply add those lines at the entry point of your processes. There is also a relevant documentation page on the interactions between Py2Exe and subprocesses.
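Concretely, "the entry point of your processes" means the top of each function you pass to a worker process. A sketch (the worker function name and log file names are illustrative, not from your code):

```python
import sys
from multiprocessing import Process

def worker():
    # First thing in the child process: replace the missing streams,
    # before anything (including multiprocessing's cleanup) flushes them.
    sys.stdout = open("my_stdout.log", "w")
    sys.stderr = open("my_stderr.log", "w")
    print("worker started")  # ends up in my_stdout.log

if __name__ == "__main__":
    p = Process(target=worker)
    p.start()
    p.join()
```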


This appears to have been a problem for quite some time; I found references going back to at least 2014. Since the error appears to be harmless, the general recommendation is to suppress it by replacing sys.stdout (and sys.stderr, which is flushed right after it) with a dummy. Try this:

import os
import sys
from multiprocessing import freeze_support

if __name__ == '__main__':
    # In a frozen windows (GUI) build the standard streams are None;
    # point them at os.devnull so flushing them at exit cannot fail.
    if sys.stdout is None:
        sys.stdout = sys.stderr = open(os.devnull, 'w')
    freeze_support()
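Putting it together, a minimal entry point for the frozen application could look like the following sketch (the worker function is a hypothetical stand-in for your real work):

```python
import os
import sys
from multiprocessing import Process, freeze_support

def worker(n):
    # Hypothetical worker; in a windows (GUI) build the child's
    # sys.stdout may also be None, so guard any printing.
    if sys.stdout is not None:
        print(f"computed {n * n}")

if __name__ == '__main__':
    # Replace the missing streams with a harmless sink so the
    # interpreter's final sys.stdout.flush() cannot fail.
    if sys.stdout is None:
        sys.stdout = sys.stderr = open(os.devnull, 'w')
    freeze_support()  # must run before any other multiprocessing code
    p = Process(target=worker, args=(3,))
    p.start()
    p.join()
```

Note that freeze_support() must be called immediately inside the __main__ guard, before any other multiprocessing machinery runs.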