
I'd like to use Python's multiprocessing module to make use of a multi-core Linux server.

I need all processes to have read/write access to the same shared memory.

Instead of using a list or a queue, is it possible to have a multi-dimensional NumPy array as the shared object?

3 Answers


I think I know what you're looking for: https://bitbucket.org/cleemesser/numpy-sharedmem/issue/3/casting-complex-ndarray-to-float-in

There's a short description on the web page: "A shared memory module for numpy by Sturla Molden and G. Varoquaux that makes it easy to share memory between processes in the form of NumPy arrays. Originally posted to the SciPy-user mailing list."

I myself am using it in just that way, sharing NumPy arrays between processes, and it works very well for me.
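
In case the package itself is unavailable, here is a minimal sketch of the same idea using only the standard library: a multiprocessing buffer viewed as a NumPy array, which forked workers read and write in place. This assumes the Linux fork start method, and the names make_shared_array and fill_row are illustrative, not part of the sharedmem API.

    # Hedged sketch: share a 2-D NumPy array between processes using
    # only the standard library (assumes the Linux fork start method,
    # so the global array is inherited by the Pool workers).
    import multiprocessing as mp
    import numpy as np

    SHAPE = (4, 6)  # illustrative shape

    def make_shared_array(shape):
        # Flat double-precision buffer in shared memory; lock=False
        # returns a raw ctypes array that np.frombuffer can wrap
        # without copying.
        base = mp.Array('d', int(np.prod(shape)), lock=False)
        return np.frombuffer(base, dtype=np.float64).reshape(shape)

    shared_arr = make_shared_array(SHAPE)

    def fill_row(i):
        # Writes land in the shared buffer, so the parent sees them.
        shared_arr[i, :] = i

    if __name__ == '__main__':
        with mp.Pool(4) as pool:
            pool.map(fill_row, range(SHAPE[0]))
        print(shared_arr)  # each row i now holds the value i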


1 Comment

Could you please update the above link? It doesn't lead to the code anymore. Thank you!
6

Look at this. It doesn't seem easy, but it's doable.

Edit: Link rotted, I have linked to another copy.
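
Since both copies have since rotted, here is a hedged sketch of one way this is doable today, on Python 3.8+ with the standard multiprocessing.shared_memory module. This may or may not match the approach in the original link; SHAPE, DTYPE, and child are illustrative names.

    # Hedged sketch: named shared memory viewed as a NumPy array
    # (Python 3.8+), which a child process attaches to by name.
    from multiprocessing import Process, shared_memory
    import numpy as np

    SHAPE, DTYPE = (3, 4), np.float64

    def child(shm_name):
        # Attach to the existing block by name; no data is copied.
        shm = shared_memory.SharedMemory(name=shm_name)
        arr = np.ndarray(SHAPE, dtype=DTYPE, buffer=shm.buf)
        arr += 1.0  # visible to the parent
        shm.close()

    if __name__ == '__main__':
        size = int(np.prod(SHAPE)) * np.dtype(DTYPE).itemsize
        shm = shared_memory.SharedMemory(create=True, size=size)
        arr = np.ndarray(SHAPE, dtype=DTYPE, buffer=shm.buf)
        arr[:] = 0.0
        p = Process(target=child, args=(shm.name,))
        p.start()
        p.join()
        print(arr)    # all ones now
        shm.close()
        shm.unlink()  # release the block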

1 Comment

Link has rotted away.

I found that even if you do not modify your NumPy array after fork()'ing a bunch of child processes, you will still see your RAM usage skyrocket as the child processes copy-on-write the object (most likely because CPython's reference counting writes into the object's memory, dirtying pages that would otherwise stay shared).

You can limit (or perhaps totally alleviate?) this problem by setting

    yourArray.flags.writeable = False

BEFORE fork()'ing/Pool()'ing, which seems to keep the RAM usage down and is a LOT less hassle than the other methods :)
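
Here is a small sketch of that trick, assuming the Linux fork start method so the Pool workers inherit the array rather than receiving pickled copies; big_array and read_only_sum are illustrative names.

    # Hedged sketch: mark the array read-only before the Pool is
    # created (workers are forked at that point).
    import multiprocessing as mp
    import numpy as np

    big_array = np.zeros((1000, 1000))
    big_array.flags.writeable = False  # set BEFORE forking

    def read_only_sum(row):
        # Workers only read the inherited array; with the flag
        # cleared, an accidental write raises a ValueError instead
        # of silently dirtying copy-on-write pages.
        return big_array[row].sum()

    if __name__ == '__main__':
        with mp.Pool(4) as pool:
            totals = pool.map(read_only_sum, range(big_array.shape[0]))
        print(sum(totals))

Note that the flag has to be cleared before the Pool is constructed, since that is when the children are forked.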

1 Comment

The correct syntax is: myarray.flags['WRITEABLE'] = False
