Re: [Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread Ephraim Obermaier
Thank you, "mpirun -n 2 pvbatch --mpi --symmetric test.py" runs as expected. But I am now using pure Python with properly set library paths. It's a bit annoying that these pvbatch traps aren't documented in the vtkMPIController class reference or in …

Re: [Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread David E DeMarle
Run with --symmetric. Without it, only the root node reads the script, and it tells the rest of the nodes what to do via ParaView's proxy mechanisms (which take effect only for vtkSMProxy and subclasses). With it, every node reads and executes the script, and all nodes do their own parts behind the …
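Under --symmetric every rank executes the script itself, so collective calls line up across ranks. A minimal sketch of such a test.py, assuming ParaView's bundled VTK (vtkMultiProcessController and its methods are real VTK names; the `_Serial` stub is a hypothetical fallback so the logic can also be read outside pvbatch):

```python
# Sketch of a symmetric test.py; run as:
#   mpirun -n 2 pvbatch --symmetric test.py
# Assumes ParaView's bundled VTK; falls back to a serial stub when vtk
# is not importable or no global controller has been installed.

def get_controller():
    """Return VTK's global MPI controller, or a serial stand-in."""
    try:
        from vtk import vtkMultiProcessController
        ctrl = vtkMultiProcessController.GetGlobalController()
        if ctrl is not None:
            return ctrl
    except ImportError:
        pass

    class _Serial:  # hypothetical stub, not part of VTK
        def GetLocalProcessId(self):
            return 0
        def GetNumberOfProcesses(self):
            return 1
        def Barrier(self):
            pass
    return _Serial()

controller = get_controller()
rank = controller.GetLocalProcessId()
size = controller.GetNumberOfProcesses()
print("Process %d of %d" % (rank, size))
controller.Barrier()  # collective: every rank must reach this line
```

With --symmetric, each of the two pvbatch ranks runs this whole file and prints its own "Process N of 2" line before meeting at the barrier.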

Re: [Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread Ephraim Obermaier
Thank you all for suggesting "pvbatch --mpi". At least this returns size=2 processes, but the updated test.py (below) hangs with the following output: $ mpirun -n 2 pvbatch --mpi test.py comm: rank: 0 size: 2 Process 0 Why is "Process 1" not printed, and why does the program hang instead of …
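The hang is characteristic of a collective call reached by only one rank: without --symmetric, rank 0 runs the script while the other ranks sit in pvbatch's service loop, so a barrier-like call on rank 0 waits forever. A toy stdlib model of that failed rendezvous (pure Python threading, nothing ParaView-specific; the timeout stands in for "hangs forever"):

```python
import threading

# Toy model of the deadlock: a barrier expects 2 participants, but only
# "rank 0" ever calls wait(), so the rendezvous never completes.
barrier = threading.Barrier(2)
result = []

def rank0():
    try:
        barrier.wait(timeout=0.2)  # would block indefinitely without a timeout
        result.append("passed")
    except threading.BrokenBarrierError:
        result.append("stuck: the second participant never arrived")

t = threading.Thread(target=rank0)
t.start()
t.join()
print(result[0])
```

In the real run, rank 0's collective MPI call blocks the same way, and rank 1 never prints "Process 1" because it never executes the script at all.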

Re: [Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread Ben Boeckel
On Tue, May 16, 2017 at 19:07:14 +0200, Ephraim Obermaier wrote: > $ mpirun -n 2 pvpython test.py I believe you want to use pvbatch for MPI-enabled Python scripts. --Ben

Re: [Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread Andy Bauer
Note that you can run both the ParaView GUI and pvpython with MPI in order to initialize, finalize and use MPI functions, but as Dave said, they should/will always be run with a single MPI process. The argument to enable MPI with either is "--mpi".

Re: [Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread David E DeMarle
Try your script within pvbatch. pvpython is analogous to the Qt client application; it is (usually) not part of an MPI execution environment. Either one can connect to an MPI-parallel pvserver. pvbatch is a Python interface that is meant to be run on the server. It is directly connected to the …

[Paraview] pvpython vtkMPIController usage? rank is always 0

2017-05-16 Thread Ephraim Obermaier
Hello, I am trying to use VTK's MPI communication from pvpython, running with Open MPI's mpirun. It seems that ParaView has not enabled the MPI capabilities for VTK, although it was compiled from source with PARAVIEW_USE_MPI=ON and correctly found the system OpenMPI-2.0.0 libraries and includes. I …
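For reference, the symptom can be reproduced with a minimal rank query: under a plain pvpython launch the global controller reports a single process, so the rank is always 0 on every mpirun instance. A sketch assuming VTK's vtkMultiProcessController API (the None-guard and serial fallback are assumptions for environments without an installed global controller):

```python
# Minimal rank query. Under "mpirun -n 2 pvpython test.py" each instance
# reports rank 0, size 1, because pvpython is not MPI-aware by default;
# the thread's resolution is "pvbatch --symmetric" for true SPMD runs.
try:
    from vtk import vtkMultiProcessController
    controller = vtkMultiProcessController.GetGlobalController()
except ImportError:
    controller = None  # vtk not importable outside ParaView's Python

if controller is not None:
    rank = controller.GetLocalProcessId()
    size = controller.GetNumberOfProcesses()
else:
    rank, size = 0, 1  # serial fallback, mirroring the observed symptom
print("comm: rank:", rank, "size:", size)
```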