Re: [petsc-users] A question about DMPlexDistribute

2016-08-16 Thread Julian Andrej
The maint branch is a more or less "stable" branch. Features go into
"master" first and are merged into "maint" later.

So

git clone -b master https://bitbucket.org/petsc/petsc petsc

will always give you the most recent code available.
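
If you already have a clone, updating it to the latest master is usually just
(a minimal sketch; assumes the default "origin" remote):

git checkout master
git pull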

On Wed, Aug 17, 2016 at 3:48 AM, leejearl  wrote:
> Thank you for your help. The problem has been overcome by following your
> advice.
>
> Now, I give some notes for this problem which might be useful for others.
>
> 1. Up to now, we must use the code in the master branch. The source code can
> be downloaded using git with:
>
> git clone https://bitbucket.org/petsc/petsc petsc.
>
> 2. Just the other day, I downloaded the code using git with: git clone -b
> maint https://bitbucket.org/petsc/petsc petsc.
>
> But this copy of the code still has the problem. I am not sure whether
> it works now.
>
> 3. The code downloaded from the PETSc website (version 3.7.3) does not
> work for this problem, although it is in the master branch.
>
>
> Thanks again for all the help. Best wishes to you.
>
> leejearl
>
>
>
>
> On 2016-08-16 19:34, Matthew Knepley wrote:
>
> On Mon, Aug 15, 2016 at 9:28 PM, leejearl  wrote:
>>
>> Thank you for all your help. I have tried reinstalling PETSc many
>> times. The errors still occur.
>>
>> I cannot find the reason. Now, I give some information in this
>> letter.
>>
>> 1> The source code is downloaded from the website of PETSc, and the
>> version is 3.7.2.
>
> You will need to run in the 'master' branch, not the release since we have
> fixed some bugs. It is best to
> use master for very new features like this. The instructions are here:
> http://www.mcs.anl.gov/petsc/developers/index.html
>
>Thanks,
>
>   Matt
>>
>> 2> Configure:
>>
>> >export PETSC_DIR=./
>>
>> >export PETSC_ARCH=arch
>>
>> >./configure --prefix=$HOME/Install/petsc
>> > --with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
>> > --download-exodusii=../externalpackages/exodus-5.24.tar.bz2
>> > --download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
>> > --download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
>> > --download-metis=../externalpackages/git.metis.tar.gz
>> > --download-parmetis=yes
>>
>> 3> The process of installation has no error.
>>
>> 4> After the installation, I added the following statement into the file
>> ~/.bashrc:
>>
>> export PETSC_ARCH=""
>>
>> export PETSC_DIR=$HOME/Install/pets/
>>
>> I wish to get some help as follows:
>>
>> 1> Are there any problems in my installation?
>>
>> 2> Can anyone show me a simple code in which the value of overlap used in
>> the DMPlexDistribute function is greater than 1?
>>
>> 3> I attach the code, makefile, grid, and error messages again; I hope
>> someone can help me figure out the problems.
>>
>> 3.1> code: cavity.c
>>
>> 3.2> makefile: makefile
>>
>> 3.3> grid: cavity.exo
>>
>> 3.4> error messages: error.dat
>>
>> It is very strange that there is no error message when I run it using
>> "mpirun -n 3 ./cavity", but when I run it using "mpirun -n 2 ./cavity", the
>> errors happen.
>>
>> The error messages are shown in the file error.dat.
>>
>>
>> Any help is appreciated.
>>
>>
>>
>> On 2016-08-13 09:04, leejearl wrote:
>>
>> Thank you for your reply. The source code I have used is from the PETSc
>> website, not from the git repository.
>>
>> I will try the code from the git repository.
>>
>>
>> leejearl
>>
>>
>> On 2016-08-13 08:49, Oxberry, Geoffrey Malcolm wrote:
>>
>>
>> On Aug 12, 2016, at 5:41 PM, leejearl  wrote:
>>
>> Hi, Matt:
>>
>>
>>
>> > Can you verify that you are running the master branch?
>>
>>
>> cd ${PETSC_DIR}
>> git branch
>>
>> The last command should return something like a list of branch names, and
>> the branch name with an asterisk to the left of it will be the branch you
>> are currently on.
>>
>> Geoff
>>
>> I am not sure; how can I verify this?
>> And I configure PETSc with this command
>> "./configure --prefix=$HOME/Install/petsc-openmpi
>> --with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4 --download-exodusii=yes
>> --download-netcdf --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
>> --download-metis=yes".
>> Is there some problem? Can you show me your command for configuring PETSc?
>>
>>
>> Thanks
>>
>> leejearl
>>
>>
>>
>>
>>
>> On 2016-08-13 01:10, Matthew Knepley wrote:
>>
>> On Thu, Aug 11, 2016 at 8:00 PM, leejearl  wrote:
>>>
>>> Thank you for your reply. I have attached the code, grid and the error
>>> message.
>>>
>>> cavity.c is the code file, cavity.exo is the grid, and error.dat is the
>>> error message.
>>>
>> The command is "mpirun -n 2 ./cavity".
>>
>>
>> Can you verify that you are running the master branch? I just ran this and
>> got
>>
>> DM Object: 2 MPI processes
>>   type: plex
>> DM_0x8404_0 in 2 dimensions:
>>   0-cells: 5253 5252
>>   1-cells: 10352 10350
>>   2-cells: 5298 (198) 5297 (198)
>> Labels:
>>   ghost: 2 strata of sizes (199, 400)
>>   vtk: 1 strata of sizes (4901)
>>   Cell Sets: 1 strata of sizes (5100)
>>   Face Sets: 3 strata of sizes (53, 99, 50)
>>   depth: 3 strata of sizes (5253, 10352, 5298)

Re: [petsc-users] A question about DMPlexDistribute

2016-08-16 Thread leejearl
Thank you for your help. The problem has been overcome by following your
advice.


Now, I give some notes for this problem which might be useful for others.

1. Up to now, we must use the code in the master branch. The source code 
can be downloaded using git with:


git clone https://bitbucket.org/petsc/petsc petsc.

2. Just the other day, I downloaded the code using git with: git clone -b
maint https://bitbucket.org/petsc/petsc petsc.


But this copy of the code still has the problem. I am not sure
whether it works now.


3. The code downloaded from the PETSc website (version 3.7.3) does not
work for this problem, although it is in the master branch.
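
As a quick check (a sketch only; "petsc" here is the clone directory from the
commands above), the current branch and latest commit of a clone can be
inspected with:

git -C petsc branch
git -C petsc log -1 --oneline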



Thanks again for all the help. Best wishes to you.

leejearl




On 2016-08-16 19:34, Matthew Knepley wrote:
On Mon, Aug 15, 2016 at 9:28 PM, leejearl wrote:


Thank you for all your help. I have tried reinstalling PETSc
many times. The errors still occur.

I cannot find the reason. Now, I give some information in
this letter.

1> The source code is downloaded from the website of PETSc, and
the version is 3.7.2.

You will need to run in the 'master' branch, not the release since we 
have fixed some bugs. It is best to
use master for very new features like this. The instructions are here: 
http://www.mcs.anl.gov/petsc/developers/index.html


   Thanks,

  Matt

2> Configure:

>export PETSC_DIR=./

>export PETSC_ARCH=arch

>./configure --prefix=$HOME/Install/petsc
--with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
--download-exodusii=../externalpackages/exodus-5.24.tar.bz2
--download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
--download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
--download-metis=../externalpackages/git.metis.tar.gz
--download-parmetis=yes

3> The process of installation has no error.

4> After the installation, I added the following statement into
the file ~/.bashrc:

export PETSC_ARCH=""

export PETSC_DIR=$HOME/Install/pets/

I wish to get some help as follows:

1> Are there any problems in my installation?

2> Can anyone show me a simple code in which the value of overlap
used in the DMPlexDistribute function is greater than 1?

3> I attach the code, makefile, grid, and error messages again;
I hope someone can help me figure out the problems.

3.1> code: cavity.c

3.2> makefile: makefile

3.3> grid: cavity.exo

3.4> error messages: error.dat

It is very strange that there is no error message when I run
it using "mpirun -n 3 ./cavity", but when I run it using "mpirun
-n 2 ./cavity", the errors happen.

The error messages are shown in the file error.dat.


Any help is appreciated.



On 2016-08-13 09:04, leejearl wrote:


Thank you for your reply. The source code I have used is from the
PETSc website, not from the git repository.

I will try the code from the git repository.


leejearl


On 2016-08-13 08:49, Oxberry, Geoffrey Malcolm wrote:



On Aug 12, 2016, at 5:41 PM, leejearl wrote:

Hi, Matt:



> Can you verify that you are running the master branch?


cd ${PETSC_DIR}
git branch

The last command should return something like a list of branch
names, and the branch name with an asterisk to the left of it
will be the branch you are currently on.

Geoff


I am not sure; how can I verify this?
And I configure PETSc with this command
"./configure --prefix=$HOME/Install/petsc-openmpi
--with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4
--download-exodusii=yes --download-netcdf
--with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
--download-metis=yes".
Is there some problem? Can you show me your command for
configuring PETSc?


Thanks

leejearl





On 2016-08-13 01:10, Matthew Knepley wrote:

On Thu, Aug 11, 2016 at 8:00 PM, leejearl wrote:

Thank you for your reply. I have attached the code, grid
and the error message.

cavity.c is the code file, cavity.exo is the grid, and
error.dat is the error message.

The command is "mpirun -n 2 ./cavity".


Can you verify that you are running the master branch? I just
ran this and got

DM Object: 2 MPI processes
  type: plex
DM_0x8404_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

 Matt

On 2016-08-11 23:29, Matthew Knepley wrote:


Re: [petsc-users] A question about DMPlexDistribute

2016-08-16 Thread Matthew Knepley
On Mon, Aug 15, 2016 at 9:28 PM, leejearl  wrote:

> Thank you for all your help. I have tried reinstalling PETSc many
> times. The errors still occur.
>
> I cannot find the reason. Now, I give some information in this
> letter.
>
> 1> The source code is downloaded from the website of PETSc, and the
> version is 3.7.2.
>
> You will need to run in the 'master' branch, not the release since we have
fixed some bugs. It is best to
use master for very new features like this. The instructions are here:
http://www.mcs.anl.gov/petsc/developers/index.html

   Thanks,

  Matt

> 2> Configure:
>
> >export PETSC_DIR=./
>
> >export PETSC_ARCH=arch
>
> >./configure --prefix=$HOME/Install/petsc 
> >--with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
> --download-exodusii=../externalpackages/exodus-5.24.tar.bz2
> --download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
> --download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
> --download-metis=../externalpackages/git.metis.tar.gz
> --download-parmetis=yes
>
> 3> The process of installation has no error.
>
> 4> After the installation, I added the following statement into the file
> ~/.bashrc:
>
> export PETSC_ARCH=""
>
> export PETSC_DIR=$HOME/Install/pets/
>
> I wish to get some help as follows:
>
> 1> Are there any problems in my installation?
>
> 2> Can anyone show me a simple code in which the value of overlap used in
> the DMPlexDistribute function is greater than 1?
>
> 3> I attach the code, makefile, grid, and error messages again; I hope
> someone can help me figure out the problems.
>
> 3.1> code: cavity.c
>
> 3.2> makefile: makefile
>
> 3.3> grid: cavity.exo
>
> 3.4> error messages: error.dat
>
> It is very strange that there is no error message when I run it using
> "mpirun -n 3 ./cavity", but when I run it using "mpirun -n 2 ./cavity", the
> errors happen.
>
> The error messages are shown in the file error.dat.
>
>
> Any help is appreciated.
>
>
>
> On 2016-08-13 09:04, leejearl wrote:
>
> Thank you for your reply. The source code I have used is from the PETSc
> website, not from the git repository.
>
> I will try the code from the git repository.
>
>
> leejearl
>
> On 2016-08-13 08:49, Oxberry, Geoffrey Malcolm wrote:
>
>
> On Aug 12, 2016, at 5:41 PM, leejearl  wrote:
>
> Hi, Matt:
>
>
> > Can you verify that you are running the master branch?
>
>
> cd ${PETSC_DIR}
> git branch
>
> The last command should return something like a list of branch names, and
> the branch name with an asterisk to the left of it will be the branch you
> are currently on.
>
> Geoff
>
> I am not sure; how can I verify this?
> And I configure PETSc with this command
> "./configure --prefix=$HOME/Install/petsc-openmpi
> --with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4
> --download-exodusii=yes --download-netcdf 
> --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
> --download-metis=yes".
> Is there some problem? Can you show me your command for configuring PETSc?
>
>
> Thanks
>
> leejearl
>
>
>
>
>
> On 2016-08-13 01:10, Matthew Knepley wrote:
>
> On Thu, Aug 11, 2016 at 8:00 PM, leejearl  wrote:
>
>> Thank you for your reply. I have attached the code, grid and the error
>> message.
>>
>> cavity.c is the code file, cavity.exo is the grid, and error.dat is the
>> error message.
>>
>> The command is "mpirun -n 2 ./cavity".
>>
>
> Can you verify that you are running the master branch? I just ran this and
> got
>
> DM Object: 2 MPI processes
>   type: plex
> DM_0x8404_0 in 2 dimensions:
>   0-cells: 5253 5252
>   1-cells: 10352 10350
>   2-cells: 5298 (198) 5297 (198)
> Labels:
>   ghost: 2 strata of sizes (199, 400)
>   vtk: 1 strata of sizes (4901)
>   Cell Sets: 1 strata of sizes (5100)
>   Face Sets: 3 strata of sizes (53, 99, 50)
>   depth: 3 strata of sizes (5253, 10352, 5298)
>
>   Thanks,
>
>  Matt
>
>
>> On 2016-08-11 23:29, Matthew Knepley wrote:
>>
>> On Thu, Aug 11, 2016 at 3:14 AM, leejearl  wrote:
>>
>>> Hi,
>>> Thank you for your reply. It helped me very much.
>>> But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I
>>> set the overlap to 2 levels, the command is
>>> "mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2 -physics sw",
>>> it fails with an error.
>>> It seems to me that setting the overlap to 2 is very common. Are there
>>> issues that I have not taken into consideration?
>>> Any help is appreciated.
>>>
>> I will check this out. I have not tested an overlap of 2 here since I
>> generally use nearest neighbor FV methods for
>> unstructured stuff. I have test examples that run fine for overlap > 1.
>> Can you send the entire error message?
>>
>> If the error is not in the distribution, but rather in the analytics,
>> that is understandable because this example is only
>> intended to be run using a nearest neighbor FV method, and thus might be
>> confused if we give it two layers of ghost
>> cells.

Re: [petsc-users] A question about DMPlexDistribute

2016-08-12 Thread leejearl
Thank you for your reply. The source code I have used is from the
PETSc website, not from the git repository.


I will try the code from the git repository.


leejearl


On 2016-08-13 08:49, Oxberry, Geoffrey Malcolm wrote:


On Aug 12, 2016, at 5:41 PM, leejearl wrote:


Hi, Matt:



> Can you verify that you are running the master branch?


cd ${PETSC_DIR}
git branch

The last command should return something like a list of branch names, 
and the branch name with an asterisk to the left of it will be the 
branch you are currently on.


Geoff


I am not sure; how can I verify this?
And I configure PETSc with this command
"./configure --prefix=$HOME/Install/petsc-openmpi 
--with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4 
--download-exodusii=yes --download-netcdf 
--with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14 --download-metis=yes".
Is there some problem? Can you show me your command for configuring 
PETSc?



Thanks

leejearl





On 2016-08-13 01:10, Matthew Knepley wrote:
On Thu, Aug 11, 2016 at 8:00 PM, leejearl wrote:


Thank you for your reply. I have attached the code, grid and the
error message.

cavity.c is the code file, cavity.exo is the grid, and error.dat
is the error message.

The command is "mpirun -n 2 ./cavity".


Can you verify that you are running the master branch? I just ran 
this and got


DM Object: 2 MPI processes
  type: plex
DM_0x8404_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

 Matt

On 2016-08-11 23:29, Matthew Knepley wrote:

On Thu, Aug 11, 2016 at 3:14 AM, leejearl wrote:

Hi,
Thank you for your reply. It helped me very much.
But for
"/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set
the overlap to 2 levels, the command is
"mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2
-physics sw", it fails with an error.
It seems to me that setting the overlap to 2 is very
common. Are there issues that I have not taken into
consideration?
Any help is appreciated.

I will check this out. I have not tested an overlap of 2 here
since I generally use nearest neighbor FV methods for
unstructured stuff. I have test examples that run fine for
overlap > 1. Can you send the entire error message?

If the error is not in the distribution, but rather in the
analytics, that is understandable because this example is only
intended to be run using a nearest neighbor FV method, and thus
might be confused if we give it two layers of ghost
cells.

   Matt


leejearl


On 2016-08-11 14:57, Julian Andrej wrote:

Hi,

take a look at slide 10 of [1]; it visually
explains what the overlap between partitions is.

[1]
https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf



On Thu, Aug 11, 2016 at 8:48 AM, leejearl
wrote:

Hi, all:
I want to use PETSc to build my FVM code. Now, I
have a question about
the function DMPlexDistribute(DM dm, PetscInt overlap,
PetscSF *sf, DM *dmOverlap).

In the example
"/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when
I set the overlap
to 0 or 1, it works well. But if I set the overlap to
2, it runs into a problem.
I am confused about the value of overlap. Can it
be set to 2? What is the meaning of
the parameter overlap?
Any help is appreciated!

leejearl









--
What most experimenters take for granted before they begin
their experiments is infinitely more interesting than any
results to which their experiments lead.
-- Norbert Wiener





--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener






--
Li Ji
Department of Fluid Mechanics, School of Aeronautics, Northwestern Polytechnical University
Phone: 13324530085
QQ: 188524324



Re: [petsc-users] A question about DMPlexDistribute

2016-08-12 Thread Oxberry, Geoffrey Malcolm

On Aug 12, 2016, at 5:41 PM, leejearl wrote:


Hi, Matt:



> Can you verify that you are running the master branch?

cd ${PETSC_DIR}
git branch

The last command should return something like a list of branch names, and the 
branch name with an asterisk to the left of it will be the branch you are 
currently on.

Geoff
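
For illustration, the output of "git branch" in a PETSc clone might look
roughly like this (sample output only, assuming master is checked out;
your list of branches will differ):

  maint
* master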

I am not sure; how can I verify this?
And I configure PETSc with this command
"./configure --prefix=$HOME/Install/petsc-openmpi 
--with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4 --download-exodusii=yes 
--download-netcdf --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14 
--download-metis=yes".
Is there some problem? Can you show me your command for configuring PETSc?


Thanks

leejearl





On 2016-08-13 01:10, Matthew Knepley wrote:
On Thu, Aug 11, 2016 at 8:00 PM, leejearl wrote:

Thank you for your reply. I have attached the code, grid and the error message.

cavity.c is the code file, cavity.exo is the grid, and error.dat is the error 
message.

The command is "mpirun -n 2 ./cavity".

Can you verify that you are running the master branch? I just ran this and got

DM Object: 2 MPI processes
  type: plex
DM_0x8404_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

 Matt

On 2016-08-11 23:29, Matthew Knepley wrote:
On Thu, Aug 11, 2016 at 3:14 AM, leejearl wrote:

Hi,
Thank you for your reply. It helped me very much.
But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set the
overlap to 2 levels, the command is
"mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2 -physics sw", it
fails with an error.
It seems to me that setting the overlap to 2 is very common. Are there issues
that I have not taken into consideration?
Any help is appreciated.

I will check this out. I have not tested an overlap of 2 here since I generally 
use nearest neighbor FV methods for
unstructured stuff. I have test examples that run fine for overlap > 1. Can you 
send the entire error message?

If the error is not in the distribution, but rather in the analytics, that is 
understandable because this example is only
intended to be run using a nearest neighbor FV method, and thus might be 
confused if we give it two layers of ghost
cells.

   Matt


leejearl

On 2016-08-11 14:57, Julian Andrej wrote:
Hi,

take a look at slide 10 of [1]; it visually explains what the overlap
between partitions is.

[1] https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf

On Thu, Aug 11, 2016 at 8:48 AM, leejearl wrote:
Hi, all:
I want to use PETSc to build my FVM code. Now, I have a question about
the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF *sf, DM
*dmOverlap).

In the example "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set
the overlap
to 0 or 1, it works well. But if I set the overlap to 2, it runs into a problem.
I am confused about the value of overlap. Can it be set to 2? What is the
meaning of
the parameter overlap?
Any help is appreciated!

leejearl








--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener




Re: [petsc-users] A question about DMPlexDistribute

2016-08-12 Thread leejearl

Hi, Matt:



> Can you verify that you are running the master branch?
I am not sure; how can I verify this?
And I configure PETSc with this command
"./configure --prefix=$HOME/Install/petsc-openmpi 
--with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4 
--download-exodusii=yes --download-netcdf 
--with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14 --download-metis=yes".

Is there some problem? Can you show me your command for configuring PETSc?


Thanks

leejearl





On 2016-08-13 01:10, Matthew Knepley wrote:
On Thu, Aug 11, 2016 at 8:00 PM, leejearl wrote:


Thank you for your reply. I have attached the code, grid and the
error message.

cavity.c is the code file, cavity.exo is the grid, and error.dat
is the error message.

The command is "mpirun -n 2 ./cavity".


Can you verify that you are running the master branch? I just ran this 
and got


DM Object: 2 MPI processes
  type: plex
DM_0x8404_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

 Matt

On 2016-08-11 23:29, Matthew Knepley wrote:

On Thu, Aug 11, 2016 at 3:14 AM, leejearl wrote:

Hi,
Thank you for your reply. It helped me very much.
But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c",
when I set the overlap to 2 levels, the command is
"mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2
-physics sw", it fails with an error.
It seems to me that setting the overlap to 2 is very common.
Are there issues that I have not taken into consideration?
Any help is appreciated.

I will check this out. I have not tested an overlap of 2 here
since I generally use nearest neighbor FV methods for
unstructured stuff. I have test examples that run fine for
overlap > 1. Can you send the entire error message?

If the error is not in the distribution, but rather in the
analytics, that is understandable because this example is only
intended to be run using a nearest neighbor FV method, and thus
might be confused if we give it two layers of ghost
cells.

   Matt


leejearl


On 2016-08-11 14:57, Julian Andrej wrote:

Hi,

take a look at slide 10 of [1]; it visually explains
what the overlap between partitions is.

[1]
https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf



On Thu, Aug 11, 2016 at 8:48 AM, leejearl wrote:

Hi, all:
I want to use PETSc to build my FVM code. Now, I
have a question about
the function DMPlexDistribute(DM dm, PetscInt overlap,
PetscSF *sf, DM *dmOverlap).

In the example
"/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I
set the overlap
to 0 or 1, it works well. But if I set the overlap to
2, it runs into a problem.
I am confused about the value of overlap. Can it be
set to 2? What is the meaning of
the parameter overlap?
Any help is appreciated!

leejearl









--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to
which their experiments lead.
-- Norbert Wiener





--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener




Re: [petsc-users] A question about DMPlexDistribute

2016-08-12 Thread Matthew Knepley
On Thu, Aug 11, 2016 at 8:00 PM, leejearl  wrote:

> Thank you for your reply. I have attached the code, grid and the error
> message.
>
> cavity.c is the code file, cavity.exo is the grid, and error.dat is the
> error message.
>
> The command is "mpirun -n 2 ./cavity".
>

Can you verify that you are running the master branch? I just ran this and
got

DM Object: 2 MPI processes
  type: plex
DM_0x8404_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

 Matt


> On 2016-08-11 23:29, Matthew Knepley wrote:
>
> On Thu, Aug 11, 2016 at 3:14 AM, leejearl  wrote:
>
>> Hi,
>> Thank you for your reply. It helped me very much.
>> But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set
>> the overlap to 2 levels, the command is
>> "mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2 -physics sw",
>> it fails with an error.
>> It seems to me that setting the overlap to 2 is very common. Are there
>> issues that I have not taken into consideration?
>> Any help is appreciated.
>>
> I will check this out. I have not tested an overlap of 2 here since I
> generally use nearest neighbor FV methods for
> unstructured stuff. I have test examples that run fine for overlap > 1.
> Can you send the entire error message?
>
> If the error is not in the distribution, but rather in the analytics, that
> is understandable because this example is only
> intended to be run using a nearest neighbor FV method, and thus might be
> confused if we give it two layers of ghost
> cells.
>
>Matt
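
To make the distinction above concrete, a minimal sketch of the usual DMPlex
FV setup is shown below; it separates the partition overlap requested in
DMPlexDistribute from the boundary ghost cells added afterwards. The calls
follow PETSc 3.7-era signatures (as used by ex11) and should be checked
against the installed version:

  DM       dmDist = NULL, dmGhost = NULL;
  PetscInt nGhost;

  /* two layers of overlap cells shared between neighboring partitions */
  ierr = DMPlexDistribute(dm, 2, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) { ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist; }
  /* ghost cells along the physical boundary, used by the FV discretization */
  ierr = DMPlexConstructGhostCells(dm, NULL, &nGhost, &dmGhost);CHKERRQ(ierr);
  if (dmGhost) { ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmGhost; }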
>
>
>>
>> leejearl
>>
>> On 2016-08-11 14:57, Julian Andrej wrote:
>>
>> Hi,
>>
>> take a look at slide 10 of [1]; it visually explains what the
>> overlap between partitions is.
>>
>> [1] https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf
>>
>> On Thu, Aug 11, 2016 at 8:48 AM, leejearl  wrote:
>>
>>> Hi, all:
>>> I want to use PETSc to build my FVM code. Now, I have a question
>>> about
>>> the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF *sf, DM
>>> *dmOverlap).
>>>
>>> In the example "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c",
>>> when I set the overlap
>>> to 0 or 1, it works well. But if I set the overlap to 2, it runs into a
>>> problem.
>>> I am confused about the value of overlap. Can it be set to 2? What
>>> is the meaning of
>>> the parameter overlap?
>>> Any help is appreciated!
>>>
>>> leejearl
>>>
>>>
>>>
>>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


Re: [petsc-users] A question about DMPlexDistribute

2016-08-11 Thread leejearl

Hi,
Thank you for your reply. It helped me very much.
But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I
set the overlap to 2 levels, the command is
"mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2 -physics sw",
it fails with an error.
It seems to me that setting the overlap to 2 is very common. Are there
issues that I have not taken into consideration?

Any help is appreciated.

leejearl


On 2016-08-11 14:57, Julian Andrej wrote:

Hi,

take a look at slide 10 of [1]; it visually explains what the
overlap between partitions is.


[1]
https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf


On Thu, Aug 11, 2016 at 8:48 AM, leejearl wrote:


Hi, all:
I want to use PETSc to build my FVM code. Now, I have a
question about
the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF
*sf, DM *dmOverlap).

In the example
"/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set the
overlap
to 0 or 1, it works well. But if I set the overlap to 2, it
runs into a problem.
I am confused about the value of overlap. Can it be set to 2?
What is the meaning of
the parameter overlap?
Any help is appreciated!

leejearl








Re: [petsc-users] A question about DMPlexDistribute

2016-08-11 Thread Julian Andrej
Hi,

take a look at slide 10 of [1]; it visually explains what the
overlap between partitions is.

[1] https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf
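
A minimal sketch of a standalone program that distributes a mesh with an
overlap greater than 1 might look like the following (PETSc 3.7-era
signatures assumed; "cavity.exo" is only a placeholder file name and the
code is untested here):

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* read and interpolate the mesh; replace the file name as appropriate */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "cavity.exo", PETSC_TRUE, &dm);CHKERRQ(ierr);
  /* request two layers of overlap cells between partitions */
  ierr = DMPlexDistribute(dm, 2, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) { ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist; }
  ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}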

On Thu, Aug 11, 2016 at 8:48 AM, leejearl  wrote:

> Hi, all:
> I want to use PETSc to build my FVM code. Now, I have a question about
> the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF *sf, DM
> *dmOverlap).
>
> In the example "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when
> I set the overlap
> to 0 or 1, it works well. But if I set the overlap to 2, it runs into a
> problem.
> I am confused about the value of overlap. Can it be set to 2? What is
> the meaning of
> the parameter overlap?
> Any help is appreciated!
>
> leejearl
>
>
>
>


[petsc-users] A question about DMPlexDistribute

2016-08-11 Thread leejearl

Hi, all:
I want to use PETSc to build my FVM code. Now, I have a question about
the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF *sf, DM
*dmOverlap).


In the example "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c",
when I set the overlap
to 0 or 1, it works well. But if I set the overlap to 2, it runs into a
problem.
I am confused about the value of overlap. Can it be set to 2? What
is the meaning of
the parameter overlap?
Any help is appreciated!

leejearl