Ola Skavhaug wrote:
> Garth N. Wells wrote the following on 11/07-2008:
>>
>> Ola Skavhaug wrote:
>>> Garth N. Wells wrote the following on 10/07-2008:
>>>> When I have MPI installed and configure DOLFIN with either PETSc or  
>>>> Trilinos, I get a bunch of errors when plot() is called (see below).
>>>> A plot appears as expected, followed by the error messages. Looks 
>>>> like a Viper issue (?).
>>> As far as I know, there shouldn't be any MPI calls in Viper/VTK. I have
>>> absolutely no idea what causes this error.
>>>
>>> Did you compile VTK with MPI? Could you report the output of this command
>>> on the shared VTK libraries:
>>> objdump -R /usr/lib/libvtk*.so  | grep MPI
>>>
>> I didn't build VTK; I'm using the Ubuntu package. From objdump I get:
>>
>>> objdump -R /usr/lib/libvtk*.so  | grep MPI
>> 000fc960 R_386_JUMP_SLOT  
>> _ZN24vtkDistributedDataFilter15MPIRedistributeEP10vtkDataSetS1_
>>
>> Garth
> 
> OK, that settles VTK. As you can see, there are no MPI calls in Viper, and
> MPI_Attr_get does not live in VTK. I have a similar problem at the moment
> when I try to combine DOLFIN with the PETSc bindings and the PyCC bindings
> to a serial build of HYPRE. That problem is most likely caused by dummy MPI
> calls in HYPRE being mixed with the real MPI calls in DOLFIN/PETSc (see the
> sketch after this message). I don't know what causes the problem you
> describe.
> 
> Can you give a more detailed description of your system build and of which
> demos fail?
> 
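
To make the hypothesis above concrete: the fatal errors quoted at the bottom
of this thread are what Open MPI emits whenever an MPI function is called
after MPI_Finalize. A minimal, self-contained C repro of that failure mode
(a hypothetical example, nothing DOLFIN- or PETSc-specific):

    /* repro.c: call MPI after MPI_Finalize to trigger the
       "after MPI was finalized" fatal error. */
    #include <mpi.h>

    int main(int argc, char* argv[])
    {
        MPI_Init(&argc, &argv);
        MPI_Finalize();

        /* Any MPI call from here on aborts; in a real program this is
           typically a destructor or atexit handler in a library that
           still holds MPI objects. */
        int rank = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        return 0;
    }

Built with mpicc and run serially, this should print essentially the same
"*** An error occurred in MPI_Comm_rank / *** after MPI was finalized"
messages as in Garth's log.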

Fresh clone of dolfin-dev

   ./scons.local enablePetsc=yes enableTrilinos=no enableMpi=yes

Running the Poisson demo in demo/pde/poisson/cpp/ produces the MPI-related
error messages. I get the same messages when running
demo/pde/poisson/python/demo.py.

I'm using OpenMPI 1.2.6, which I built and installed myself, and the Ubuntu
8.04 VTK packages.

Garth


> Ola
> 
>>> Ola
>>>
>>>  
>>>> Garth
>>>>
>>>> Plotting Function, press q to continue...
>>>> *** An error occurred in MPI_Attr_get
>>>> *** after MPI was finalized
>>>> *** MPI_ERRORS_ARE_FATAL (goodbye)
>>>> [gnw20pc:3330] Abort before MPI_INIT completed successfully; not able 
>>>> to guarantee that all other processes were killed!
>>>> *** An error occurred in MPI_Comm_rank
>>>> *** after MPI was finalized
>>>> *** MPI_ERRORS_ARE_FATAL (goodbye)
>>>> *** An error occurred in MPI_Type_free
>>>> *** after MPI was finalized
>>>> *** MPI_ERRORS_ARE_FATAL (goodbye)
>>>> Segmentation fault
>>>> Unable to plot (PyDOLFIN or Viper plotter not available).
>>>> Saved function u (discrete function) to file poisson.pvd in VTK format.
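
The pattern in the log above (MPI_Attr_get, MPI_Comm_rank and MPI_Type_free
all failing "after MPI was finalized") suggests cleanup code, such as
destructors or atexit handlers, running after something else has already
called MPI_Finalize. Cleanup code that can run that late is normally guarded
with MPI_Finalized; a minimal sketch in C (illustrative only, not DOLFIN or
PETSc source):

    /* Guard cleanup that may run after MPI_Finalize, e.g. from a
       destructor or atexit handler. */
    #include <mpi.h>

    void cleanup_type(MPI_Datatype* type)
    {
        int finalized = 0;
        MPI_Finalized(&finalized);  /* legal to call even after finalize */
        if (!finalized)
            MPI_Type_free(type);    /* only touch MPI while it is alive */
    }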