I ran the program on Linux with 1, 2, and 4 processes under valgrind for both 
types of boundary conditions and it ran fine. 

   Suggest your colleagues do a test configure of PETSc using --download-mpich 
and see if they still get the problem or if it runs ok.
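
   For example, something along these lines (keeping the other options they 
already use; the exact option list is just an illustration):

./configure --download-f-blas-lapack --with-scalar-type=complex --download-mpich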

   They can also run with valgrind and see what it reports. 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
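
   For example, something like the following, where ./main stands in for 
their executable:

mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.log.%p ./main -malloc off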

   Barry

On Jul 18, 2014, at 8:16 PM, John Yawney <jyawney...@gmail.com> wrote:

> Hello,
> 
> I had a question about PETSc installations. On my local computer I configured 
> PETSc (v 3.4.2) using the options:
> 
> ./configure --with-cc=mpicc --with-cxx=mpic++ --download-f-blas-lapack 
> --download-mpich --download-hypre
> 
> I wrote a test program that defines a vector using DMDAs, computes a dot 
> product, exchanges halo elements, and computes a low-order FD derivative of 
> the vector. Under my installation of PETSc everything works fine. For some 
> reason, when my colleagues run the program, they get segmentation fault 
> errors. If they change the y and z boundary types to GHOSTED as well, the 
> program runs to the end (though it still seg faults there), but each process 
> reports only a local value of the dot product. I've attached the main.cpp 
> file for this test.
> 
> When they installed their versions of PETSc they didn't use the 
> --download-mpich option but instead used either:
> ./configure --download-f-blas-lapack --with-scalar-type=complex
> or with the option: --with-mpi-dir=/home/kim/anaconda/pkgs/mpich2-1.3-py27_0
> 
> Could this be causing a problem with the parallelization under PETSc?
> 
> Thanks for the help and sorry for the long question.
> 
> Best regards,
> John
> <main.cpp>
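
For reference, a minimal sketch of the kind of DMDA test described above. This 
is not the attached main.cpp: the 1D layout, grid size, and GHOSTED boundary 
type are illustrative assumptions, and error checking (CHKERRQ) is omitted for 
brevity. It uses the PETSc 3.4 names; in 3.5 and later DMDA_BOUNDARY_GHOSTED 
becomes DM_BOUNDARY_GHOSTED.

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM          da;
  Vec         gvec, lvec;
  PetscScalar dot;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* 1D distributed array: 64 points, 1 dof, stencil width 1, ghosted
     boundaries. */
  DMDACreate1d(PETSC_COMM_WORLD, DMDA_BOUNDARY_GHOSTED, 64, 1, 1, NULL, &da);

  DMCreateGlobalVector(da, &gvec);
  DMCreateLocalVector(da, &lvec);
  VecSet(gvec, 1.0);

  /* VecDot is collective: every rank should see the same global value. */
  VecDot(gvec, gvec, &dot);
  PetscPrintf(PETSC_COMM_WORLD, "dot = %g\n", (double)PetscRealPart(dot));

  /* Halo exchange: fill the ghost points of the local vector from the
     global vector. */
  DMGlobalToLocalBegin(da, gvec, INSERT_VALUES, lvec);
  DMGlobalToLocalEnd(da, gvec, INSERT_VALUES, lvec);

  VecDestroy(&gvec);
  VecDestroy(&lvec);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

If each process prints a different, purely local dot product, that usually 
points to a mismatch between the MPI library PETSc was built against and the 
mpiexec used to launch the job, which is what the --download-mpich test above 
is meant to rule out.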
