I am not an expert on parallel processing in Python, but since no one else 
is answering, and assuming you mean multithreading, I'll point out the 
main problem with multithreading in Python.

Python's main implementation, CPython, has a Global Interpreter Lock (GIL), 
which allows only one thread to execute Python bytecode at a time. When you 
use modules such as threading, CPython does create real OS threads, but only 
one of them runs Python code at any given moment. For CPU-bound work you 
therefore won't see a speedup; threading in CPython is mostly useful for 
I/O-bound tasks, or as a way to structure control flow in your code.
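To make that concrete, here is a minimal sketch (names like sum_squares are my own, just for illustration) that splits a CPU-bound computation across two threads. It runs and produces the correct total, but because of the GIL the two threads take turns on the interpreter, so you shouldn't expect it to run faster than the single-threaded version:

```python
import threading

# A purely CPU-bound task: partial sums of squares over a range.
def sum_squares(lo, hi, out, idx):
    out[idx] = sum(i * i for i in range(lo, hi))

results = [0, 0]
threads = [
    threading.Thread(target=sum_squares, args=(0, 50_000, results, 0)),
    threading.Thread(target=sum_squares, args=(50_000, 100_000, results, 1)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()  # wait for both partial sums before combining them

total = sum(results)
```

The result is correct, but on CPython the wall-clock time is roughly the same as doing the whole loop in one thread.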

http://en.wikipedia.org/wiki/Global_Interpreter_Lock

The multiprocessing module can instead fork (or spawn) independent processes, 
each with its own interpreter and its own GIL, so CPU-bound work can actually 
run in parallel. The downside is that each new process needs its own copy of 
the parent's data, and anything passed between processes has to be pickled, 
which can be costly and not very convenient. IPython also supports 
distributing computations across a cluster of many machines.
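A minimal multiprocessing sketch (the worker function square is my own example, not anything from sympy) looks like this; the `if __name__ == "__main__"` guard is needed because on some platforms child processes re-import the module:

```python
import multiprocessing

# The worker must be defined at module level so child processes can pickle it.
def square(x):
    return x * x

if __name__ == "__main__":
    # Each worker is a separate OS process with its own interpreter and GIL,
    # so the map really does run in parallel on multiple cores.
    with multiprocessing.Pool(processes=2) as pool:
        results = pool.map(square, range(10))
    print(results)
```

Note that the inputs and outputs of `square` are pickled and sent between processes, which is the overhead mentioned above.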

As far as I know, PyPy has been experimenting with better support for 
multithreading.

On Friday, April 25, 2014 6:13:01 PM UTC+2, brombo wrote:
>
> Has anyone had any experience in using parallel python with sympy. It 
> seems to me that there are probably a lot of loops that have independent 
> operations. 
>

-- 
You received this message because you are subscribed to the Google Groups 
"sympy" group.
