Re: [petsc-users] Meaning of error message (gamg & fieldsplit related)

2016-08-16 Thread Barry Smith

  I first set the option -ksp_converged_reason and obtained the message

$ ./ex1 
Residual norms for fieldsplit_PA_ solve.
0 KSP unpreconditioned resid norm 2.288897733893e+03 true resid norm 
2.288897733893e+03 ||r(i)||/||b|| 1.e+00
Linear solve did not converge due to DIVERGED_PCSETUP_FAILED iterations 0
   PCSETUP_FAILED due to SUBPC_ERROR 

so it is failing while building the preconditioner. Since it didn't even get past 
the first fieldsplit, I guessed that it failed in the GAMG setup for the first 
fieldsplit, so I ran with

-fieldsplit_PA_ksp_error_if_not_converged
 
and got

$ ./ex1 
Residual norms for fieldsplit_PA_ solve.
0 KSP unpreconditioned resid norm 2.288897733893e+03 true resid norm 
2.288897733893e+03 ||r(i)||/||b|| 1.e+00
Linear solve did not converge due to DIVERGED_PCSETUP_FAILED iterations 0
   PCSETUP_FAILED due to SUBPC_ERROR 
~/Src/petsc/test-dir (master=) arch-master-basic
$ ./ex1 -fieldsplit_PA_ksp_error_if_not_converged
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR:   
[0]PETSC ERROR: KSPSolve has not converged
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.7.3-1182-g00a02d5  GIT Date: 
2016-08-16 15:09:17 -0500
[0]PETSC ERROR: ./ex1 on a arch-master-basic named Barrys-MacBook-Pro.local by 
barrysmith Wed Aug 17 00:35:16 2016
[0]PETSC ERROR: Configure options --with-mpi-dir=/Users/barrysmith/libraries
[0]PETSC ERROR: #1 KSPSolve() line 850 in 
/Users/barrysmith/Src/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #2 PCGAMGOptProlongator_AGG() line 1221 in 
/Users/barrysmith/Src/petsc/src/ksp/pc/impls/gamg/agg.c

Looking at the code and seeing the options prefix for this KSP solve, I ran with 

-fieldsplit_PA_ksp_error_if_not_converged 
-fieldsplit_PA_gamg_est_ksp_monitor_true_residual 

and got

$ ./ex1 -fieldsplit_PA_ksp_error_if_not_converged 
-fieldsplit_PA_gamg_est_ksp_monitor_true_residual 
  Residual norms for fieldsplit_PA_gamg_est_ solve.
  0 KSP none resid norm 7.030417576826e+07 true resid norm 1.006594247197e+01 
||r(i)||/||b|| 1.e+00
  1 KSP none resid norm 6.979279406029e+07 true resid norm 1.150138107009e+01 
||r(i)||/||b|| 1.142603497100e+00
  2 KSP none resid norm 6.979246564783e+07 true resid norm 6.970771666727e+03 
||r(i)||/||b|| 6.925105807166e+02
  3 KSP none resid norm 6.978033367036e+07 true resid norm 9.958555706490e+02 
||r(i)||/||b|| 9.893316730368e+01
  4 KSP none resid norm 6.977995917588e+07 true resid norm 1.095475380870e+04 
||r(i)||/||b|| 1.088298869103e+03
  5 KSP none resid norm 6.954940040289e+07 true resid norm 2.182804459638e+04 
||r(i)||/||b|| 2.168504802919e+03
  6 KSP none resid norm 6.905975832912e+07 true resid norm 4.557801389945e+04 
||r(i)||/||b|| 4.527943014412e+03
  7 KSP none resid norm 6.905788649989e+07 true resid norm 4.476060162996e+04 
||r(i)||/||b|| 4.446737278162e+03
  8 KSP none resid norm 5.464732207984e+07 true resid norm 7.801211607942e+04 
||r(i)||/||b|| 7.750105496491e+03
  9 KSP none resid norm 5.393328767072e+07 true resid norm 8.529925739695e+04 
||r(i)||/||b|| 8.474045787013e+03
 10 KSP none resid norm 5.294387823310e+07 true resid norm 8.380999411358e+04 
||r(i)||/||b|| 8.326095082197e+03
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR:   
[0]PETSC ERROR: KSPSolve has not converged


The smoother is simply not working AT ALL on your first submatrix, so I ran 
with -fieldsplit_PA_gamg_est_ksp_pmat_view, loaded the resulting matrix from 
the file binaryoutput into MATLAB, and checked its eigenvalues:

>> e = eig(full(a))

e =

   5.5260 +85.4938i
   5.5260 -85.4938i
 -11.6673 + 0.i
  -6.5409 + 0.i
   6.5240 + 0.i
  -2.5377 + 2.0951i
  -2.5377 - 2.0951i
   3.0712 + 0.i
  -0.5365 + 0.9521i
  -0.5365 - 0.9521i
   1.0710 + 0.i
  -0.9334 + 0.i
   0.7608 + 0.i
   0.4337 + 0.i
  -0.4558 + 0.i
  -0.4011 + 0.i
   0.1212 + 0.i
   0.0327 + 0.0338i
   0.0327 - 0.0338i
  -0.0426 + 0.i
  -0.0360 + 0.0334i
  -0.0360 - 0.0334i
   0.0308 + 0.i
  -0.0003 + 0.0327i
  -0.0003 - 0.0327i
  -0.0323 + 0.i
   0.0120 + 0.0185i
   0.0120 - 0.0185i
  -0.0256 + 0.i
  -0.0225 + 0.i
   0.0152 + 0.i
  -0.0083 + 0.0125i
  -0.0083 - 0.0125i
  -0.0177 + 0.i
  -0.0175 + 0.i
  -0.0177 + 0.i
  -0.0158 + 0.i
  -0.0176 + 0.i
  -0.0136 + 0.0038i
  -0.0136 - 0.0038i
   0.0125 + 0.i
  -0.0080 + 0.0069i
  -0.0080 - 0.0069i
   0.0066 + 0.0075i
   0.0066 - 0.0075i
   0.0097 + 0.i
   0.0039 + 0.0085i
   0.0039 - 0.0085i
   0.0070 + 0.i
  -0.0095 + 0.0011i
  -0.0095 - 0.0011i
  -0.0064 + 0.i
   0.0024 + 0.0036i
   0.0024 - 0.0036i
   0.0042 + 0.i
   0.0042 - 0.i
   0.0040 + 0.i
  -0.0035 + 0.0021i
  -0.0035 - 0.0021i
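
For anyone who wants to repeat this kind of check, here is a minimal sketch of
loading such a PETSc binary dump back into a standalone program. It assumes a
standard PETSc build and relies on binaryoutput being PETSc's default file name
for options-driven binary viewers. (PETSc also ships PetscBinaryRead.m for
loading the same file into MATLAB, which is presumably how the eig(full(a))
check above was done.)

#include <petscmat.h>

/* Minimal sketch: read a matrix written in PETSc binary format (the default
   options-driven dump file is called "binaryoutput") back into a standalone
   program so it can be inspected on its own. */
int main(int argc, char **argv)
{
  Mat            A;
  PetscViewer    viewer;
  PetscInt       m, n;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "binaryoutput", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);   /* defaults to AIJ unless overridden on the command line */
  ierr = MatLoad(A, viewer);CHKERRQ(ierr);     /* read the dumped matrix */
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  ierr = MatGetSize(A, &m, &n);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Loaded a %D x %D matrix\n", m, n);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Compiled against the same PETSc and run in the directory containing
binaryoutput, this prints the matrix dimensions; further inspection can then be
scripted as needed.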
  

Re: [petsc-users] A question about DMPlexDistribute

2016-08-16 Thread Julian Andrej
The maint branch is a more or less "stable" branch. Features go into
"master" first and later get merged into "maint".

So

git clone -b master https://bitbucket.org/petsc/petsc petsc

will always give you the most recent code you can possibly obtain.

On Wed, Aug 17, 2016 at 3:48 AM, leejearl  wrote:
> Thank you for your help. The problem has been overcome by following your
> advice.
>
> Now I give some notes on this problem which might be useful to others.
>
> 1. For now, we must use the code in the master branch. The source code can
> be downloaded using git with:
>
> git clone https://bitbucket.org/petsc/petsc petsc
>
> 2. Just the other day, I downloaded the code using git with: git clone -b
> maint https://bitbucket.org/petsc/petsc petsc.
>
> But this copy of the code still had the problem. I am not sure whether
> it works now.
>
> 3. The code downloaded from the PETSc website (version 3.7.3) does not
> work for this problem, although it is in the master branch.
>
>
> Thanks again for all the help. Best wishes to you.
>
> leejearl
>
>
>
>
> On August 16, 2016 at 19:34, Matthew Knepley wrote:
>
> On Mon, Aug 15, 2016 at 9:28 PM, leejearl  wrote:
>>
>> Thank you for all your help. I have tried and reinstalled PETSc many
>> times, but the errors are still there.
>>
>> I cannot figure out the reason. Now I give some information in this
>> letter.
>>
>> 1> The source code was downloaded from the PETSc website, and the
>> version is 3.7.2.
>
> You will need to run in the 'master' branch, not the release since we have
> fixed some bugs. It is best to
> use master for very new features like this. The instructions are here:
> http://www.mcs.anl.gov/petsc/developers/index.html
>
>Thanks,
>
>   Matt
>>
>> 2> Configure:
>>
>> >export PETSC_DIR=./
>>
>> >export PETSC_ARCH=arch
>>
>> >./configure --prefix=$HOME/Install/petsc
>> > --with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
>> > --download-exodusii=../externalpackages/exodus-5.24.tar.bz2
>> > --download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
>> > --download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
>> > --download-metis=../externalpackages/git.metis.tar.gz
>> > --download-parmetis=yes
>>
>> 3> The installation process completed with no errors.
>>
>> 4> After the installation, I added the following statements to the file
>> ~/.bashrc:
>>
>> export PETSC_ARCH=""
>>
>> export PETSC_DIR=$HOME/Install/pets/
>>
>> I wish to get some help, as follows:
>>
>> 1> Are there any problems in my installation?
>>
>> 2> Can anyone show me a simple code in which the overlap value passed to
>> the DMPlexDistribute function is greater than 1?
>>
>> 3> I attach the code, makefile, grid, and the error messages again; I hope
>> someone can help me figure out the problems.
>>
>> 3.1> code: cavity.c
>>
>> 3.2> makefile: makefile
>>
>> 3.3> grid: cavity.exo
>>
>> 3.4> error messages: error.dat
>>
>> It is very strange that there is no error message when I run it using
>> "mpirun -n 3 ./cavity", but when I run it using "mpirun -n 2 ./cavity", the
>> errors happen.
>>
>> The error messages are shown in the file error.dat.
>>
>>
>> Any help is appreciated.
>>
>>
>>
>> On August 13, 2016 at 09:04, leejearl wrote:
>>
>> Thank you for your reply. The source code I have used is from the PETSc
>> website, not from the git repository.
>>
>> I will try the code from the git repository.
>>
>>
>> leejearl
>>
>>
>> On August 13, 2016 at 08:49, Oxberry, Geoffrey Malcolm wrote:
>>
>>
>> On Aug 12, 2016, at 5:41 PM, leejearl  wrote:
>>
>> Hi, Matt:
>>
>>
>>
>> > Can you verify that you are running the master branch?
>>
>>
>> cd ${PETSC_DIR}
>> git branch
>>
>> The last command should return something like a list of branch names, and
>> the branch name with an asterisk to the left of it will be the branch you
>> are currently on.
>>
>> Geoff
>>
>> I am not sure; how can I verify this?
>> And I configure PETSc with this command
>> "./configure --prefix=$HOME/Install/petsc-openmpi
>> --with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4 --download-exodusii=yes
>> --download-netcdf --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
>> --download-metis=yes".
>> Is there some problem? Can you show me your command for configuring PETSc?
>>
>>
>> Thanks
>>
>> leejearl
>>
>>
>>
>>
>>
>> On August 13, 2016 at 01:10, Matthew Knepley wrote:
>>
>> On Thu, Aug 11, 2016 at 8:00 PM, leejearl  wrote:
>>>
>>> Thank you for your reply. I have attached the code, grid and the error
>>> message.
>>>
>>> cavity.c is the code file, cavity.exo is the grid, and error.dat is the
>>> error message.
>>>
>> The command is "mpirun -n 2 ./cavity".
>>
>>
>> Can you verify that you are running the master branch? I just ran this and
>> got
>>
>> DM Object: 2 MPI processes
>>   type: plex
>> DM_0x8404_0 in 2 dimensions:
>>   0-cells: 5253 5252
>>   1-cells: 10352 10350
>>   2-cells: 5298 (198) 5297 (198)
>> Labels:
>>   ghost: 2 strata of sizes (199, 400)
>>   vtk: 1 strata of sizes (4901)
>>   Cell Sets: 1 strata of sizes (5100)
>>   Face Sets: 3 strata of sizes (53, 99, 50)
>>   depth: 3 strata of sizes (5253, 10352, 5298)

Re: [petsc-users] A question about DMPlexDistribute

2016-08-16 Thread leejearl
Thank you for your help. The problem has been overcome by following your 
advice.


Now I give some notes on this problem which might be useful to others.

1. For now, we must use the code in the master branch. The source code 
can be downloaded using git with:


git clone https://bitbucket.org/petsc/petsc petsc

2. Just the other day, I downloaded the code using git with: git clone -b 
maint https://bitbucket.org/petsc/petsc petsc.


But this copy of the code still had the problem. I am not sure 
whether it works now.


3. The code downloaded from the PETSc website (version 3.7.3) does not 
work for this problem, although it is in the master branch.



Thanks again for all the help. Best wishes to you.

leejearl




On August 16, 2016 at 19:34, Matthew Knepley wrote:
On Mon, Aug 15, 2016 at 9:28 PM, leejearl wrote:


Thank you for all your help. I have tried and reinstalled PETSc
many times, but the errors are still there.

I cannot figure out the reason. Now I give some information in
this letter.

1> The source code was downloaded from the PETSc website, and
the version is 3.7.2.

You will need to run in the 'master' branch, not the release since we 
have fixed some bugs. It is best to
use master for very new features like this. The instructions are here: 
http://www.mcs.anl.gov/petsc/developers/index.html


   Thanks,

  Matt

2> Configure:

>export PETSC_DIR=./

>export PETSC_ARCH=arch

>./configure --prefix=$HOME/Install/petsc
--with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
--download-exodusii=../externalpackages/exodus-5.24.tar.bz2
--download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
--download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
--download-metis=../externalpackages/git.metis.tar.gz
--download-parmetis=yes

3> The installation process completed with no errors.

4> After the installation, I added the following statements to
the file ~/.bashrc:

export PETSC_ARCH=""

export PETSC_DIR=$HOME/Install/pets/

I wish to get some help, as follows:

1> Are there any problems in my installation?

2> Can anyone show me a simple code in which the overlap value
passed to the DMPlexDistribute function is greater than 1?

3> I attach the code, makefile, grid, and the error messages again;
I hope someone can help me figure out the problems.

3.1> code: cavity.c

3.2> makefile: makefile

3.3> grid: cavity.exo

3.4> error messages: error.dat

It is very strange that there is no error message when I run
it using "mpirun -n 3 ./cavity", but when I run it using "mpirun
-n 2 ./cavity", the errors happen.

The error messages are shown in the file error.dat.


Any help is appreciated.



On August 13, 2016 at 09:04, leejearl wrote:


Thank you for your reply. The source code I have used is from the
PETSc website, not from the git repository.

I will try the code from the git repository.


leejearl


On August 13, 2016 at 08:49, Oxberry, Geoffrey Malcolm wrote:



On Aug 12, 2016, at 5:41 PM, leejearl wrote:

Hi, Matt:



> Can you verify that you are running the master branch?


cd ${PETSC_DIR}
git branch

The last command should return something like a list of branch
names, and the branch name with an asterisk to the left of it
will be the branch you are currently on.

Geoff


I am not sure; how can I verify this?
And I configure PETSc with this command
"./configure --prefix=$HOME/Install/petsc-openmpi
--with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4
--download-exodusii=yes --download-netcdf
--with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
--download-metis=yes".
Is there some problem? Can you show me your command for
configuring PETSc?


Thanks

leejearl





On August 13, 2016 at 01:10, Matthew Knepley wrote:

On Thu, Aug 11, 2016 at 8:00 PM, leejearl wrote:

Thank you for your reply. I have attached the code, grid
and the error message.

cavity.c is the code file, cavity.exo is the grid, and
error.dat is the error message.

The command is "mpirun -n 2 ./cavity".


Can you verify that you are running the master branch? I just
ran this and got

DM Object: 2 MPI processes
  type: plex
DM_0x8404_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

 Matt

On August 11, 2016 at 23:29, Matthew Knepley wrote:

On Thu, 

Re: [petsc-users] A question about DMPlexDistribute

2016-08-16 Thread Matthew Knepley
On Mon, Aug 15, 2016 at 9:28 PM, leejearl  wrote:

> Thank you for all your help. I have tried and reinstalled PETSc many
> times, but the errors are still there.
>
> I cannot figure out the reason. Now I give some information in this
> letter.
>
> 1> The source code was downloaded from the PETSc website, and the
> version is 3.7.2.
>
You will need to run in the 'master' branch, not the release since we have
fixed some bugs. It is best to
use master for very new features like this. The instructions are here:
http://www.mcs.anl.gov/petsc/developers/index.html

   Thanks,

  Matt

> 2> Configure:
>
> >export PETSC_DIR=./
>
> >export PETSC_ARCH=arch
>
> >./configure --prefix=$HOME/Install/petsc 
> >--with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
> --download-exodusii=../externalpackages/exodus-5.24.tar.bz2
> --download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
> --download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
> --download-metis=../externalpackages/git.metis.tar.gz
> --download-parmetis=yes
>
> 3> The installation process completed with no errors.
>
> 4> After the installation, I added the following statements to the file
> ~/.bashrc:
>
> export PETSC_ARCH=""
>
> export PETSC_DIR=$HOME/Install/pets/
>
> I wish to get some help, as follows:
>
> 1> Are there any problems in my installation?
>
> 2> Can anyone show me a simple code in which the overlap value passed to
> the DMPlexDistribute function is greater than 1?
>
> 3> I attach the code, makefile, grid, and the error messages again; I hope
> someone can help me figure out the problems.
>
> 3.1> code: cavity.c
>
> 3.2> makefile: makefile
>
> 3.3> grid: cavity.exo
>
> 3.4> error messages: error.dat
>
> It is very strange that there is no error message when I run it using
> "mpirun -n 3 ./cavity", but when I run it using "mpirun -n 2 ./cavity", the
> errors happen.
>
> The error messages are shown in the file error.dat.
>
>
> Any help is appreciated.
>
>
>
> On August 13, 2016 at 09:04, leejearl wrote:
>
> Thank you for your reply. The source code I have used is from the PETSc
> website, not from the git repository.
>
> I will try the code from the git repository.
>
>
> leejearl
>
> On August 13, 2016 at 08:49, Oxberry, Geoffrey Malcolm wrote:
>
>
> On Aug 12, 2016, at 5:41 PM, leejearl  wrote:
>
> Hi, Matt:
>
>
> > Can you verify that you are running the master branch?
>
>
> cd ${PETSC_DIR}
> git branch
>
> The last command should return something like a list of branch names, and
> the branch name with an asterisk to the left of it will be the branch you
> are currently on.
>
> Geoff
>
> I am not sure; how can I verify this?
> And I configure PETSc with this command
> "./configure --prefix=$HOME/Install/petsc-openmpi
> --with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4
> --download-exodusii=yes --download-netcdf 
> --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
> --download-metis=yes".
> Is there some problem? Can you show me your command for configuring PETSc?
>
>
> Thanks
>
> leejearl
>
>
>
>
>
> On August 13, 2016 at 01:10, Matthew Knepley wrote:
>
> On Thu, Aug 11, 2016 at 8:00 PM, leejearl  wrote:
>
>> Thank you for your reply. I have attached the code, grid and the error
>> message.
>>
>> cavity.c is the code file, cavity.exo is the grid, and error.dat is the
>> error message.
>>
>> The command is "mpirun -n 2 ./cavity".
>>
>
> Can you verify that you are running the master branch? I just ran this and
> got
>
> DM Object: 2 MPI processes
>   type: plex
> DM_0x8404_0 in 2 dimensions:
>   0-cells: 5253 5252
>   1-cells: 10352 10350
>   2-cells: 5298 (198) 5297 (198)
> Labels:
>   ghost: 2 strata of sizes (199, 400)
>   vtk: 1 strata of sizes (4901)
>   Cell Sets: 1 strata of sizes (5100)
>   Face Sets: 3 strata of sizes (53, 99, 50)
>   depth: 3 strata of sizes (5253, 10352, 5298)
>
>   Thanks,
>
>  Matt
>
>
>> On August 11, 2016 at 23:29, Matthew Knepley wrote:
>>
>> On Thu, Aug 11, 2016 at 3:14 AM, leejearl  wrote:
>>
>>> Hi,
>>> Thank you for your reply. It helps me very much.
>>> But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I
>>> set the overlap to 2 levels, the command is
>>> "mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2 -physics sw",
>>> and it gives an error.
>>> It seems to me that setting the overlap to 2 is very common. Are there
>>> issues that I have not taken into consideration?
>>> Any help is appreciated.
>>>
>> I will check this out. I have not tested an overlap of 2 here since I
>> generally use nearest neighbor FV methods for
>> unstructured stuff. I have test examples that run fine for overlap > 1.
>> Can you send the entire error message?
>>
>> If the error is not in the distribution, but rather in the analytics,
>> that is understandable because this example is only
>> intended to be run using a nearest neighbor FV method, and thus might be
>> confused if we give it two layers of ghost
>>