This is happening because PETSc was configured to compile with the C 
compiler and use C complex numbers, while your main program is C++, which 
does not accept the C version of complex numbers (yes, C and C++ are not as 
friendly as they should be). 

   I’ll try a build and see if I can fix this.

   You can run the same configure but add the additional ./configure option 
--with-clanguage=c++  (two dashes before the with) and it should work, because 
then everything will be built with the C++ compiler and will use C++ complex.

   Barry
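
For reference, a minimal sketch of the mismatch described above. This is illustrative code only (not from the thread or from PETSc): a PETSc library configured for C expands PetscPowScalar() to the C99 cpow() from <complex.h>, which is not declared in C++ translation units, whereas a --with-clanguage=c++ build uses std::complex, for which std::pow is the right spelling.

// illustrative_complex.cpp -- hypothetical example, not part of PETSc or of
// the attached main.cpp; it only shows the C vs. C++ complex-power spellings.
#include <complex>
#include <cstdio>

int main()
{
  std::complex<double> base(2.0, 0.0), expo(3.0, 0.0);

  // Fine in C++: std::pow is overloaded for std::complex.
  std::complex<double> z = std::pow(base, expo);

  // The C99 spelling the failing header expands to would be roughly
  //   double complex z = cpow(base, expo);   /* valid only in C, via <complex.h> */
  // and clang rejects cpow() in a .cpp file, which is exactly the
  // "undeclared identifier 'cpow'" error in the log below.

  std::printf("%g%+gi\n", z.real(), z.imag());
  return 0;
}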

On Jul 19, 2014, at 5:16 PM, Francis Poulin <fpou...@uwaterloo.ca> wrote:

> Hello again,
> 
> My installation on my desktop seems to work well.
> 
> I am having problems with my installation of PETSc on my Mac, though.  It 
> seems to go through the installation all right and even shows me the nice 
> graph of the performance.  I used:
> 
> ./configure --with-scalar-type=complex --with-cc=clang --with-cxx=clang++  
> --with-fc=gfortran --with-c2html=0 --download-mpich
> 
> However, when I try compiling the main.cpp I get an error, which I have 
> copied below.  I know this should work, and I wonder what my PETSc 
> installation is not finding.  Any suggestions?
> 
> Cheers, Francis
> 
> 
> TEST_MPI fpoulin$ make all -s
> clang: warning: argument unused during compilation: '-L/Users/fpoulin/software/petsc/lib'
> clang: warning: argument unused during compilation: '-L/Users/fpoulin/software/petsc/darwin10.6.0-c-debug/lib'
> In file included from main.cpp:17:
> In file included from /Users/fpoulin/software/petsc/include/petsc.h:5:
> In file included from /Users/fpoulin/software/petsc/include/petscbag.h:4:
> In file included from /Users/fpoulin/software/petsc/include/petscsys.h:366:
> /Users/fpoulin/software/petsc/include/petscmath.h:562:10: error: use of undeclared identifier 'cpow'; did you mean 'pow'?
>  return PetscPowScalar(base,cpower);
>         ^
> /Users/fpoulin/software/petsc/include/petscmath.h:252:31: note: expanded from macro 'PetscPowScalar'
> #define PetscPowScalar(a,b)   PetscPowComplex(a,b)
>                              ^
> /Users/fpoulin/software/petsc/include/petscmath.h:176:38: note: expanded from macro 'PetscPowComplex'
> #define PetscPowComplex(a,b)         cpow(a,b)
>                                     ^
> /usr/include/math.h:443:15: note: 'pow' declared here
> extern double pow(double, double);
>              ^
> 1 error generated.
> make: *** [build/main.o] Error 1
> 
> 
> 
> ------------------
> Francis Poulin
> Associate Professor
> Associate Chair, Undergraduate Studies
> Department of Applied Mathematics
> University of Waterloo
> 
> email:           fpou...@uwaterloo.ca
> Web:            https://uwaterloo.ca/poulin-research-group/
> Telephone:  +1 519 888 4567 x32637
> 
> 
> ________________________________________
> From: Francis Poulin
> Sent: Saturday, July 19, 2014 2:09 PM
> To: petsc-users
> Subject: RE: [petsc-users] Question about PETSc installs and MPI
> 
> Hello Satish,
> 
> Thanks for the help on both counts.
> 
> 1) I installed csh and petscmpiexec works, but I think I will stick to mpirun
> 
> 2) Your modifications of the code make things work beautifully!
> 
> Again, sorry to bug you but the help is greatly appreciated.
> 
> Cheers, Francis
> 
> ------------------
> Francis Poulin
> Associate Professor
> Associate Chair, Undergraduate Studies
> Department of Applied Mathematics
> University of Waterloo
> 
> email:           fpou...@uwaterloo.ca
> Web:            https://uwaterloo.ca/poulin-research-group/
> Telephone:  +1 519 888 4567 x32637
> 
> 
> ________________________________________
> From: Satish Balay [ba...@mcs.anl.gov]
> Sent: Saturday, July 19, 2014 11:52 AM
> To: petsc-users
> Cc: Francis Poulin
> Subject: Re: [petsc-users] Question about PETSc installs and MPI
> 
> And it appears the original code was tested with 'real' numbers and
> petsc-3.4. It was not really tested with complex numbers. I had to make the
> following changes for it to run with complex & petsc-3.5.
> 
> Satish
> 
> ---------
> 
> balay@es^/scratch/balay/test $ ../petsc-3.5.0/arch-linux2-c-debug/bin/mpiexec 
> -n 8 valgrind --tool=memcheck -q ./main
> Dot Product Check...
> 
>    Dot Product (should be 2*256) = 512.000000
> 
> Done.
> 
> Derivative Check...
> 
>           Min (should be 0; if halos not defined properly, would be -1.0/dDeltaX) = 0.000000
>           Max (should be 0; if halos not defined properly, would be +1.0/dDeltaX) = 0.000000
> 
> Done.
> balay@es^/scratch/balay/test $ diff -Nru main.cpp.orig main.cpp
> --- main.cpp.orig            2014-07-19 10:48:21.072486076 -0500
> +++ main.cpp                 2014-07-19 10:47:26.977985006 -0500
> @@ -31,9 +31,9 @@
>   // DMDA Environment
>   //////////////////////////////////////////////////////////////////////////
>   // Boundary Conditions
> -  DMDABoundaryType bx = DMDA_BOUNDARY_GHOSTED;
> -  DMDABoundaryType by = DMDA_BOUNDARY_NONE; // no boundary points since nGx = 1
> -  DMDABoundaryType bz = DMDA_BOUNDARY_NONE; // no boundary points since nGz = 1
> +  DMBoundaryType bx = DM_BOUNDARY_GHOSTED;
> +  DMBoundaryType by = DM_BOUNDARY_NONE; // no boundary points since nGx = 1
> +  DMBoundaryType bz = DM_BOUNDARY_NONE; // no boundary points since nGz = 1
> 
>        //////////////////////////////////////////////////////////////////////////
>        // Stencil type
> @@ -96,7 +96,7 @@
>   DMDAGetCorners(m_3da, &si, 0, 0, &ei, 0, 0);
> 
>        // Access the elements of the local arrays as C++ multi-dim. array structures
> -       double ***pvecA, ***pvecB;
> +       PetscScalar ***pvecA, ***pvecB;
>        DMDAVecGetArray(m_3da, m_vecA, &pvecA);
>        DMDAVecGetArray(m_3da, m_vecB, &pvecB);
> 
> @@ -132,7 +132,7 @@
>   VecAssemblyEnd(m_gTempB);
> 
>        // Take dot product
> -       double dDotProduct;
> +       PetscScalar dDotProduct;
>        VecDot(m_gTempA, m_gTempB, &dDotProduct);
> 
>        // Output dot product to check
> @@ -163,7 +163,7 @@
>   // Alternative (local vector): VecDuplicate(m_vecA, &m_vecDx);
> 
>        // Access the data in the vectors by using pointers
> -       double ***pDx;
> +       PetscScalar ***pDx;
>        DMDAVecGetArray(m_3da, m_vecA, &pvecA);
>        DMDAVecGetArray(m_3da, m_vecDx, &pDx);
> 
> balay@es^/scratch/balay/test $
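
The pattern behind these edits: with --with-scalar-type=complex, PetscScalar is a complex type, so anything receiving values from VecDot() or DMDAVecGetArray() has to be declared as PetscScalar rather than double. A minimal, hypothetical sketch of that pattern (my own, untested, not part of Satish's diff; error checking omitted for brevity):

#include <petsc.h>

int main(int argc, char **argv)
{
  PetscInitialize(&argc, &argv, NULL, NULL);

  Vec a, b;
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 256, &a);
  VecDuplicate(a, &b);
  VecSet(a, 1.0);
  VecSet(b, 2.0);

  PetscScalar dot;   /* was "double" in the real-only version of the code */
  VecDot(a, b, &dot);

  /* Print only the real part so the same line compiles for real and complex builds. */
  PetscPrintf(PETSC_COMM_WORLD, "Dot product (should be 2*256) = %f\n",
              (double)PetscRealPart(dot));

  VecDestroy(&a);
  VecDestroy(&b);
  PetscFinalize();
  return 0;
}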
> 
> 
> 
> On Sat, 19 Jul 2014, Satish Balay wrote:
> 
>> On Sat, 19 Jul 2014, Francis Poulin wrote:
>> 
>>> Hello Barry,
>>> 
>>> I was one of the two people who had difficulties getting the correct 
>>> results with John's code.  Previously, I didn't have valgrind installed, so 
>>> I installed it using apt-get.  Then I configured PETSc using the following:
>>> 
>>> ./configure --with-scalar-type=complex --with-cc=gcc --with-cxx=c++ 
>>> --with-fc=gfortran --with-c2html=0 --download-mpich --download-scalapack 
>>> --download-hypre
>>> 
>>> This is on Ubuntu and is different from what I tried before in that I am 
>>> now downloading mpich, scalapack, and hypre.  I decided to download 
>>> scalapack since that seems like it could be useful.  I was told that HYPRE 
>>> doesn't work with complex variables.  Too bad, but not a big deal.
>> 
>> PETSc does not use scalapack. It's useful only if you are using MUMPS:
>> 
>> --download-metis --download-parmetis --download-scalapack --download-mumps
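
For example, a complete configure line along those lines might look like this (illustrative only, not tested here; adjust the compiler options to your machine):

./configure --with-scalar-type=complex --with-cc=gcc --with-cxx=c++ \
    --with-fc=gfortran --with-c2html=0 --download-mpich \
    --download-metis --download-parmetis --download-scalapack --download-mumps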
>> 
>>> 
>>> It completes the configure, make all, and make test, and even gives me the 
>>> figures of the parallel efficiency (or not quite efficiency, maybe).  I 
>>> didn't catch any errors, but there are possible errors in the log.  When I 
>>> went to try making an example, I found that I can't use petscmpiexec to run 
>>> anything in serial or parallel.
>>> 
>>> fpoulin@vortex:~/software/petsc/src/ts/examples/tutorials$ 
>>> /home/fpoulin/software/petsc/bin/petscmpiexec -n 1 ./ex1
>>> -bash: /home/fpoulin/software/petsc/bin/petscmpiexec: /bin/csh: bad 
>>> interpreter: No such file or directory
>> 
>> Perhaps you do not have csh installed on this machine. You can use mpiexec 
>> directly:
>> 
>> ./ex1
>> /home/fpoulin/software/petsc/arch-linux2-c-debug/bin/mpiexec -n 1 ./ex1
>> /home/fpoulin/software/petsc/arch-linux2-c-debug/bin/mpiexec -n 2 ./ex1
>> 
>> Satish
>> 
>>> 
>>> I am sorry to bother you with this.  I am also having issues with my 
>>> installation on my Mac, but I thought if I can figure this one out then 
>>> maybe I will have a better idea what's wrong with the other.
>>> 
>>> Thanks,
>>> Francis
>>> 
>>> 
>>> ------------------
>>> Francis Poulin
>>> Associate Professor
>>> Associate Chair, Undergraduate Studies
>>> Department of Applied Mathematics
>>> University of Waterloo
>>> 
>>> email:           fpou...@uwaterloo.ca
>>> Web:            https://uwaterloo.ca/poulin-research-group/
>>> Telephone:  +1 519 888 4567 x32637
>>> 
>>> 
>>> ________________________________________
>>> From: Barry Smith [bsm...@mcs.anl.gov]
>>> Sent: Friday, July 18, 2014 9:57 PM
>>> To: John Yawney
>>> Cc: petsc-users@mcs.anl.gov; Francis Poulin; Kim Usi
>>> Subject: Re: [petsc-users] Question about PETSc installs and MPI
>>> 
>>>   I ran the program on Linux with 1, 2, and 4 processes under valgrind for 
>>> both types of boundary conditions and it ran fine.
>>> 
>>>   Suggest your colleagues do a test configure of PETSc using 
>>> --download-mpich and see if they still get the problem or if it runs OK.
>>> 
>>>   They can also run with valgrind and see what it reports: 
>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> 
>>>   Barry
>>> 
>>> On Jul 18, 2014, at 8:16 PM, John Yawney <jyawney...@gmail.com> wrote:
>>> 
>>>> Hello,
>>>> 
>>>> I had a question about PETSc installations. On my local computer I 
>>>> configured PETSc (v 3.4.2) using the options:
>>>> 
>>>> ./configure --with-cc=mpicc --with-cxx=mpic++ --download-f-blas-lapack 
>>>> --download-mpich --download-hypre
>>>> 
>>>> I wrote a test program that defines a vector using DMDAs, computes a dot 
>>>> product, exchanges halo elements, and computes a low-order FD derivative 
>>>> of the vector. Under my installation of PETSc everything works fine. For 
>>>> some reason, when my colleagues run the program, they get segmentation 
>>>> fault errors. If they change the y and z boundary types to GHOSTED as 
>>>> well, the program runs to the end (though it still seg faults there), but 
>>>> they get only a local value of the dot product. I've attached the 
>>>> main.cpp file for this script.
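
For context, a stripped-down sketch of that pattern (my own illustration, not the attached main.cpp; it uses the 3.4-era DMDA_BOUNDARY_* names, which newer releases rename to DM_BOUNDARY_*):

#include <petsc.h>

int main(int argc, char **argv)
{
  PetscInitialize(&argc, &argv, NULL, NULL);

  /* 1-D DMDA with ghosted boundaries, 64 points, 1 dof, stencil width 1. */
  DM da;
  DMDACreate1d(PETSC_COMM_WORLD, DMDA_BOUNDARY_GHOSTED, 64, 1, 1, NULL, &da);

  Vec gA, lA;
  DMCreateGlobalVector(da, &gA);
  DMCreateLocalVector(da, &lA);
  VecSet(gA, 1.0);

  /* Halo exchange: scatter global values plus ghost points into the local vector. */
  DMGlobalToLocalBegin(da, gA, INSERT_VALUES, lA);
  DMGlobalToLocalEnd(da, gA, INSERT_VALUES, lA);

  /* Dot product over the global vector. */
  PetscScalar dot;
  VecDot(gA, gA, &dot);
  PetscPrintf(PETSC_COMM_WORLD, "dot = %f\n", (double)PetscRealPart(dot));

  VecDestroy(&gA);
  VecDestroy(&lA);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}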
>>>> 
>>>> When they installed their versions of PETSc they didn't use the 
>>>> --download-mpich option but instead used either:
>>>> ./configure --download-f-blas-lapack --with-scalar-type=complex
>>>> or with the option: 
>>>> --with-mpi-dir=/home/kim/anaconda/pkgs/mpich2-1.3-py27_0
>>>> 
>>>> Could this be causing a problem with the parallelization under PETSc?
>>>> 
>>>> Thanks for the help and sorry for the long question.
>>>> 
>>>> Best regards,
>>>> John
>>>> <main.cpp>
>>> 
>>> 
>> 
