On 10/14/06, Bill Spotz <[EMAIL PROTECTED]> wrote:
> I would like to second the notion of converging on a single MPI
> interface.  My parallel project encapsulates most of the inter-
> processor communication within higher-level objects because the lower-
> level communication patterns can usually be determined from higher-
> level data structures.  But still, there are times when a user would
> like access to the lower-level MPI interface.

Using mpi4py, you have access to almost all MPI internals directly
from the Python side, with an API really similar to the MPI-2 C++ bindings.
This is a feature I've not seen in other Python bindings for MPI. I
think this is really important for developers and for people learning MPI:
you do not need to learn a new API.


-- 
Lisandro Dalcín
---------------
Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
PTLC - Güemes 3450, (3000) Santa Fe, Argentina
Tel/Fax: +54-(0)342-451.1594

_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/numpy-discussion
