No multiprocessing print outputs (Spyder)

I have recently started to delve into multiprocessing, as I believe my code can be easily parallelized. While working through the tutorials, though, I encountered an issue: functions executed in worker processes do not seem to print anything to the console.

Here's the culprit:

__spec__ = None # This line is required for Spyder and not part of the actual example

from multiprocessing import Process
import os

def info(title):
    print(title)
    print('module name:', __name__)
    print('parent process:', os.getppid())
    print('process id:', os.getpid())

def f(name):
    info('function f')
    print('hello', name)

if __name__ == '__main__':
    info('main line')
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()

The output I receive is the following:

main line 
module name: __main__ 
parent process: 10812 
process id: 11348

Clearly, the console only shows the output of the info call made in the main process, but none of the output of the f function (which runs in a separate multiprocessing.Process). I have run into the same behaviour with other examples I found online: computations are done and returned correctly when using multiprocessing, but prints never show up in the console.

Does anybody know why, and how to address this issue?

On a possibly related note, I am using Python 3.6 in Spyder 3.2.4. Spyder seems to have a few quirks: the first line of the code is already a workaround required to make multiprocessing work at all, an issue I found already discussed here. A similar, unresolved issue was mentioned here.

I would appreciate any help, and a happy new year to everyone.


Solution 1:

(Spyder maintainer here) Multiprocessing doesn't work well on Windows in Spyder's IPython console. However, you can run your code in an external terminal to get the output you expect.

To do that, please go to

Run > Configuration per file > Execute in an external system terminal

Solution 2:

You can run it through Spyder's IPython console by saving the worker function in a separate .py file and importing it into the script you're running. Note that f calls info, so info (and the os import it relies on) has to live in that file as well; otherwise the child process raises a NameError when it imports the module. For example, save:

import os

def info(title):
    print(title)
    print('module name:', __name__)
    print('parent process:', os.getppid())
    print('process id:', os.getpid())

def f(name):
    info('function f')
    print('hello', name)

in a file called worker.py. Then, in your main file, do the following:

from multiprocessing import Process
import worker

if __name__ == '__main__':
    worker.info('main line')
    p = Process(target=worker.f, args=('bob',))
    p.start()
    p.join()
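
If you also want the child's messages to appear inside the IPython console itself, one workaround (not part of the original answers) is to have the worker send its output back to the parent through a multiprocessing.Queue and print it there, since the parent's prints are always visible. A minimal sketch, assuming a hypothetical queue-aware variant f_queue is added to worker.py:

# Hypothetical addition to worker.py
import os

def f_queue(name, queue):
    # send messages back to the parent instead of printing in the child
    queue.put('function f, process id: {}'.format(os.getpid()))
    queue.put('hello {}'.format(name))
    queue.put(None)  # sentinel: no more messages

# Main script, run from the IPython console
from multiprocessing import Process, Queue
import worker

if __name__ == '__main__':
    q = Queue()
    p = Process(target=worker.f_queue, args=('bob', q))
    p.start()
    for msg in iter(q.get, None):  # read until the sentinel arrives
        print(msg)                 # printed by the parent, so it shows up in the console
    p.join()

Because all printing happens in the parent process, the messages become visible regardless of where the child's own stdout ends up.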