Is it possible to multiprocess a function that returns something in Python?

Solution 1:

You are looking to do some embarrassingly parallel work using multiple processes, so why not use a Pool? A Pool will take care of starting the worker processes, submitting the work to them, and returning the results to you.

I use pathos, which has a fork of multiprocessing, because it has much better serialization than the version the standard library provides.
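To give a sense of what "better serialization" buys you: pathos serializes with dill rather than pickle, so it can ship things like lambdas and closures to the workers. A minimal sketch of that difference (assuming dill is installed, which it is as a pathos dependency):

import dill

# pickle.dumps(square) would raise PicklingError; dill handles it
square = lambda x: x**2
payload = dill.dumps(square)
restored = dill.loads(payload)
print(restored(3))  # 9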

Example (.py file):

from pathos.multiprocessing import ProcessingPool as Pool

def foo(obj1, obj2):
    # square the x attribute of each argument
    a = obj1.x**2
    b = obj2.x**2
    return a, b

class Bar(object):
    def __init__(self, x):
        self.x = x

# one element from each list is passed to each call of foo
res = Pool().map(foo, [Bar(1), Bar(2), Bar(3)], [Bar(4), Bar(5), Bar(6)])
print(res)

Result

[(1, 16), (4, 25), (9, 36)]

As you can see, foo takes two arguments and returns a tuple of two values. The map method of Pool submits foo to the worker processes, pairing up one element from each argument list per call, and collects the results into res.

You can get pathos here: https://github.com/uqfoundation
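For comparison, when your function and classes are picklable (defined at module level, as they are here), the standard library's Pool can do the same job. A rough stdlib sketch using starmap, under that assumption:

from multiprocessing import Pool

def foo(obj1, obj2):
    return obj1.x**2, obj2.x**2

class Bar(object):
    def __init__(self, x):
        self.x = x

if __name__ == '__main__':
    # starmap unpacks each (obj1, obj2) pair into foo's two arguments
    with Pool() as pool:
        res = pool.starmap(foo, zip([Bar(1), Bar(2), Bar(3)],
                                    [Bar(4), Bar(5), Bar(6)]))
    print(res)  # [(1, 16), (4, 25), (9, 36)]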

Solution 2:

Yes, sure - you can use a number of methods. One of the easiest is a shared Queue: each worker process puts its result on the queue, and the parent reads the results back off (a minimal sketch follows below). See a fuller example here: http://eli.thegreenplace.net/2012/01/16/python-parallelizing-cpu-bound-tasks-with-multiprocessing/
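A minimal sketch of the Queue approach (my own illustration, not the code from the linked article):

from multiprocessing import Process, Queue

def worker(x, q):
    # each worker puts an (input, result) pair on the shared queue
    q.put((x, x**2))

if __name__ == '__main__':
    q = Queue()
    procs = [Process(target=worker, args=(i, q)) for i in range(4)]
    for p in procs:
        p.start()
    # drain the queue before joining to avoid blocking on large results
    results = [q.get() for _ in procs]
    for p in procs:
        p.join()
    print(results)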