Dear All,
To summarize this tale of woe: I still can't get the simple parallel
fipy example to work. That is,
mpirun -n 2 python mesh1D.py
produces 2 identical outputs of the calculation at each step.
Per Dr. Guyer's helpful e-mails to Igor and me, I have verified:
o trilinos 9.0.3 has been completely re-built with OpenMPI
1.3.3; trilinos examples (C++) work w/ mpirun as expected
o libpytrilinos *is* linked against libmpi (ldd libpytrilinos)
o libpytrilinos has *no* MPI symbols; *all* other .so libs in
that same lib directory *do* have MPI symbols
o mpi4py is installed in the standard site-packages location
(same place as numpy, scipy, and even fipy itself)
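For what it's worth, the symbol check in the list above can be scripted so it is easy to repeat across every .so in the lib directory. A minimal sketch (the libpytrilinos path is of course site-specific, and `nm -D` only reports the dynamic symbol table):

```python
import subprocess

def mpi_symbols(nm_output):
    """Extract the MPI_* symbol names from the output of `nm -D <lib>`."""
    names = []
    for line in nm_output.splitlines():
        fields = line.split()
        # The symbol name is the last field of each nm line.
        if fields and fields[-1].startswith("MPI_"):
            names.append(fields[-1])
    return names

def check_library(lib_path):
    """Run `nm -D` on a shared library and report any MPI symbols found."""
    result = subprocess.run(["nm", "-D", lib_path],
                            capture_output=True, text=True)
    return mpi_symbols(result.stdout)

# Site-specific path -- adjust before running:
# print(check_library("/path/to/trilinos/lib/libpytrilinos.so"))
```

An empty list from check_library() would confirm the "no MPI symbols" observation for libpytrilinos, and a non-empty one for the neighboring libraries.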
1) At the moment, it seems that the lack of MPI symbols in
libpytrilinos is the most fundamental issue.
2) Bear with me just a moment. We have an mpi4py that
"works" -- that is, this very simple-minded program (and others,
not so simple-minded) behaves as expected:
=======================
#!/usr/bin/env python
from sys import stdout
from mpi4py import MPI

tot_procs = MPI.COMM_WORLD.Get_size()
rank = MPI.COMM_WORLD.Get_rank()
machine = MPI.Get_processor_name()
tst_msg = "Aloha, World. This proc= %d (of %d) is running on machine= %s.\n"
stdout.write(tst_msg % (rank, tot_procs, machine))
=======================
But, obviously, this example must explicitly import mpi4py
in order to run properly.
3) HOW DOES PyTrilinos KNOW ABOUT mpi4py?
At the moment, I think that's the problem. By hook or by
crook, something in PyTrilinos has to "import mpi4py";
how does that happen? Or rather, how *should* it happen,
since apparently it isn't happening? Does mpi4py perhaps
need to be located somewhere specific?
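One thing that is easy to check from Python itself is exactly which mpi4py the interpreter would pick up, without triggering an MPI_Init as a side effect. A sketch using the standard importlib machinery (modern Python):

```python
import importlib.util

def locate_package(name):
    """Return the filesystem origin of a package, or None if not importable."""
    spec = importlib.util.find_spec(name)
    return None if spec is None else spec.origin

# On our system this should point into site-packages/mpi4py.
print("mpi4py found at:", locate_package("mpi4py"))
```

If this prints None under the same interpreter that runs fipy, then whatever PyTrilinos might try to import, it would never find mpi4py in the first place.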
All the trilinos configuration flags have to do with "mpi"; there
are no Python stubs of any kind in my release of OpenMPI.
There are cryptic/generic import statements in the PyTrilinos
files, but nowhere that I have been able to find is anything like
mpi4py explicitly called out.
4) I use only 2 MPI flags when configuring trilinos:
--with-mpi=[...]/trilinos/OPENMPI \
--with-mpi-compilers \
I believe that the code implementing the bindings between
mpi4py and (standard) MPI is located in a lib called MPI.so,
which, at our facility, after a standard install of mpi4py,
wound up here:
[our_python_area]/site-packages/mpi4py/MPI.so
(As mentioned, this is the same "site-packages" containing
things like numpy and fipy.)
5) Something else that might help, if you have a working
fipy/trilinos installation: are you using mpi4py? Where is
it installed? Is it in fact associated with libpytrilinos (does
"ldd libpytrilinos" list MPI.so)?
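For anyone with a working setup who wants to compare notes, a small diagnostic along these lines should report the same world size from both stacks (Epetra.PyComm is the standard PyTrilinos communicator; the try/except just keeps the script usable when one of the two packages isn't installed). My guess is that a serial-only libpytrilinos would report 1 process even under mpirun -n 2, which would match the duplicated output we're seeing.

```python
def comm_sizes():
    """World size as seen by mpi4py and by PyTrilinos/Epetra, where available."""
    sizes = {}
    try:
        from mpi4py import MPI
        sizes["mpi4py"] = MPI.COMM_WORLD.Get_size()
    except ImportError:
        pass
    try:
        from PyTrilinos import Epetra
        # PyComm() wraps MpiComm in an MPI build, SerialComm otherwise.
        sizes["Epetra"] = Epetra.PyComm().NumProc()
    except ImportError:
        pass
    return sizes

print(comm_sizes())  # run under: mpirun -n 2 python this_script.py
```

With a correctly built libpytrilinos, both entries should report 2 under mpirun -n 2; a mismatch would pin the problem on the Trilinos side rather than on mpi4py.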
That's enough grief for one day... thanks for your help.
+jtg+