Re: [petsc-users] DMPlex problem

2016-09-26 Thread Stefano Zampini
Morten, 

have a look at the local-to-global mapping that is created for the MATIS case
in DMCreateMatrix_Plex in src/dm/impls/plex/plex.c, and at how DMPlexMatSetClosure
is used in src/snes/utils/dmplexsnes.c in the MATIS case.

MATIS is a format for non-overlapping domain decomposition, where you assemble
your stiffness matrix on each subdomain (MPI process) separately.
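
For reference, a minimal sketch of what this looks like from the user side (an illustration, not code from this thread; it assumes a default PetscSection has already been set on the DM):

#include <petsc.h>

/* Hedged sketch: request the subassembled MATIS format from a DMPlex.
   DMCreateMatrix() then builds the local-to-global mapping discussed above;
   element matrices can afterwards be inserted with DMPlexMatSetClosure()
   exactly as for an AIJ matrix. */
static PetscErrorCode CreateSubassembledMatrix(DM dm, Mat *K)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMSetMatType(dm, MATIS); CHKERRQ(ierr);  /* one sequential block per MPI process */
  ierr = DMCreateMatrix(dm, K); CHKERRQ(ierr);    /* attaches the l2g map built in DMCreateMatrix_Plex */
  PetscFunctionReturn(0);
}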

On Sep 26, 2016, at 5:04 PM, Matthew Knepley <knep...@gmail.com> wrote:

> On Mon, Sep 26, 2016 at 8:41 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:
> Hi Matt
> 
> We are trying to do a simple FE assembly using DMPlex, but when assembling the global
> stiffness matrix we run into problems when running with NP>1 - that is, the global
> matrix differs when we move to a distributed system, where it should not.
> 
> In pseudo-code our CreateGlobalStiffnessMatrix does the following:
>   create a local stiffness matrix ke with some values
>   for each local cell/element e (using the result of DMPlexGetHeightStratum(..,0,..))
>     for each of its vertices (using DMPlexGetTransitiveClosure(..,e,..))
>       collect the local/global indices into edof
>     update the global stiffness matrix K using the index mapping edof and the values ke
> 
> The code we have sent is a simplified version, which just builds a dummy
> stiffness matrix - but we believe this matrix should still be the same
> independent of NP. (That is why we use the trace.)
> 
> I am not sure what is wrong there, but there are a bunch of Plex FEM 
> examples, like SNES ex12, ex62, ex77.
>  
> I'm not familiar with MatSetValuesClosure(). Is that the missing piece?
> 
> I use this to do indexing since it is so error-prone to do it yourself. I
> think this could be the problem with your example.
> 
> When you distribute the mesh, ordering changes, and we do permutations to 
> keep all local indices contiguous. The
> MatSetValuesClosure() is a convenience function for FEM which takes in a mesh 
> point (could be cell, face, etc.) and
> translates that point number to a set of row indices using
> 
>   a) the transitive closure of that point in the mesh DAG
> 
> and
> 
>   b) the default PetscSection for the DM which maps mesh points to sets of 
> dofs (row indices)
> 
> For it to work, you need to set up the default DM section, which it looks like 
> you have done.
> 
>   Thanks,
> 
> Matt
>  
> Kind regards,
> Morten
> 
> 
> From: Matthew Knepley [knep...@gmail.com]
> Sent: Monday, September 26, 2016 2:19 PM
> To: Morten Nobel-Jørgensen
> Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
> Subject: Re: [petsc-users] DMPlex problem
> 
> On Mon, Sep 26, 2016 at 7:00 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:
> Hi Matthew
> 
> It seems like the problem is not fully fixed. I have changed the code to now
> run with 2, 3 and 4 cells. When I run the code using NP = 1..3 I get
> different results between NP=1 and NP=2/3 when the cell count is larger than 2.
> 
> Do you mean the trace? I have no idea what you are actually putting in.
> 
> I have a lot of debugging when you use MatSetValuesClosure(), but when you
> directly use
> MatSetValuesLocal(), you are handling things yourself.
> 
>   Thanks,
> 
>  Matt
>  
> Kind regards,
> Morten
> 
> mpiexec -np 1 ./ex18k
> cells 2
> Loc size: 36
> Trace of matrix: 132.00
> cells 3
> Loc size: 48
> Trace of matrix: 192.00
> cells 4
> Loc size: 60
> Trace of matrix: 258.00
> mpiexec -np 2 ./ex18k
> cells 2
> Loc size: 24
> Loc size: 24
> Trace of matrix: 132.00
> cells 3
> Loc size: 36
> Loc size: 24
> Trace of matrix: 198.00
> cells 4
> Loc size: 36
> Loc size: 36
> Trace of matrix: 264.00
> mpiexec -np 3 ./ex18k
> cells 2
> Loc size: 24
> Loc size: 24
> Loc size: 0
> Trace of matrix: 132.00
> cells 3
> Loc size: 24
> Loc size: 24
> Loc size: 24
> Trace of matrix: 198.00
> cells 4
> Loc size: 36
> Loc size: 24
> Loc size: 24
> Trace of matrix: 264.00
> 
> 
> 
> 
> From: petsc-users-boun...@mcs.anl.gov [petsc-users-boun...@mcs.anl.gov] on 
> behalf of Morten Nobel-Jørgensen [m...@dtu.dk]
> Sent: Sunday, September 25, 2016 11:15 AM
> To: Matthew Knepley
> Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
> Subject: Re: [petsc-users] DMPlex problem
> 
> Hi Matthew
> 
> Thank you for the bug-fix :) I can confirm that it works :)
> 
> And thanks for your hard work on PETSc - your work is very much appreciated!
> 
> Kind regards,
> Morten
> From: Matthew Knepley [knep...@gmail.com]
> Sent: Friday, September 23, 2016 2:46 PM
> To: Morten Nobel-Jørgensen
> Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
> Subject: Re: [petsc-users] DMPlex problem

Re: [petsc-users] DMPlex problem

2016-09-26 Thread Matthew Knepley
On Mon, Sep 26, 2016 at 8:41 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:

> Hi Matt
>
> We are trying to do a simple FE assembly using DMPlex, but when assembling the global
> stiffness matrix we run into problems when running with NP>1 - that is, the global
> matrix differs when we move to a distributed system, where it should not.
>
> In pseudo-code our CreateGlobalStiffnessMatrix does the following:
>
>   create a local stiffness matrix ke with some values
>   for each local cell/element e (using the result of DMPlexGetHeightStratum(..,0,..))
>     for each of its vertices (using DMPlexGetTransitiveClosure(..,e,..))
>       collect the local/global indices into edof
>     update the global stiffness matrix K using the index mapping edof and the values ke
>
>
> The code we have sent is a simplified version, which just builds a dummy
> stiffness matrix - but we believe this matrix should still be the same
> independent of NP. (That is why we use the trace.)
>

I am not sure what is wrong there, but there are a bunch of Plex FEM
examples, like SNES ex12, ex62, ex77.


> I'm not familiar with MatSetValuesClosure(). Is that the missing piece?
>

I use this to do indexing since it is so error-prone to do it yourself.
I think this could be the problem with your example.

When you distribute the mesh, ordering changes, and we do permutations to
keep all local indices contiguous. The
MatSetValuesClosure() is a convenience function for FEM which takes in a
mesh point (could be cell, face, etc.) and
translates that point number to a set of row indices using

  a) the transitive closure of that point in the mesh DAG

and

  b) the default PetscSection for the DM which maps mesh points to sets of
dofs (row indices)

For it to work, you need to set up the default DM section, which it looks
like you have done.
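
The routine referred to here appears to correspond to DMPlexMatSetClosure() in the DMPlex API (the function Stefano points to above). A hedged sketch of using it for the element loop, with 2016-era section accessors (DMGetDefaultSection()/DMGetDefaultGlobalSection(), later renamed DMGetLocalSection()/DMGetGlobalSection()) and an illustrative function name:

#include <petsc.h>

/* Hedged sketch: let DMPlex translate a cell into matrix row/column indices via
   (a) the transitive closure and (b) the sections, instead of building edof by
   hand. The element matrix ke must be ordered to match the closure of each cell. */
static PetscErrorCode AssembleWithClosure(DM dm, Mat K, PetscScalar *ke)
{
  PetscSection   section, globalSection;
  PetscInt       cStart, cEnd, c;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMGetDefaultSection(dm, &section); CHKERRQ(ierr);              /* point -> local dofs */
  ierr = DMGetDefaultGlobalSection(dm, &globalSection); CHKERRQ(ierr);  /* point -> global dofs */
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd); CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    ierr = DMPlexMatSetClosure(dm, section, globalSection, K, c, ke, ADD_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}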

  Thanks,

Matt


> Kind regards,
> Morten
>
>
> --
> *From:* Matthew Knepley [knep...@gmail.com]
> *Sent:* Monday, September 26, 2016 2:19 PM
> *To:* Morten Nobel-Jørgensen
> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
> *Subject:* Re: [petsc-users] DMPlex problem
>
> On Mon, Sep 26, 2016 at 7:00 AM, Morten Nobel-Jørgensen <m...@dtu.dk>
> wrote:
>
>> Hi Matthew
>>
>> It seems like the problem is not fully fixed. I have changed the code to
>> now run with 2, 3 and 4 cells. When I run the code using NP = 1..3 I
>> get different results between NP=1 and NP=2/3 when the cell count is larger than
>> 2.
>>
>
> Do you mean the trace? I have no idea what you are actually putting in.
>
> I have a lot of debugging when you use MatSetValuesClosure(), but when
> you directly use
> MatSetValuesLocal(), you are handling things yourself.
>
>   Thanks,
>
>  Matt
>
>
>> Kind regards,
>> Morten
>> 
>> mpiexec -np 1 ./ex18k
>> cells 2
>> Loc size: 36
>> Trace of matrix: 132.00
>> cells 3
>> Loc size: 48
>> Trace of matrix: 192.00
>> cells 4
>> Loc size: 60
>> Trace of matrix: 258.00
>> mpiexec -np 2 ./ex18k
>> cells 2
>> Loc size: 24
>> Loc size: 24
>> Trace of matrix: 132.00
>> cells 3
>> Loc size: 36
>> Loc size: 24
>> Trace of matrix: 198.00
>> cells 4
>> Loc size: 36
>> Loc size: 36
>> Trace of matrix: 264.00
>> mpiexec -np 3 ./ex18k
>> cells 2
>> Loc size: 24
>> Loc size: 24
>> Loc size: 0
>> Trace of matrix: 132.00
>> cells 3
>> Loc size: 24
>> Loc size: 24
>> Loc size: 24
>> Trace of matrix: 198.00
>> cells 4
>> Loc size: 36
>> Loc size: 24
>> Loc size: 24
>> Trace of matrix: 264.00
>>
>>
>>
>>
>> --
>> *From:* petsc-users-boun...@mcs.anl.gov [petsc-users-boun...@mcs.anl.gov]
>> on behalf of Morten Nobel-Jørgensen [m...@dtu.dk]
>> *Sent:* Sunday, September 25, 2016 11:15 AM
>> *To:* Matthew Knepley
>> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
>> *Subject:* Re: [petsc-users] DMPlex problem
>>
>> Hi Matthew
>>
>> Thank you for the bug-fix :) I can confirm that it works :)
>>
>> And thanks for your hard work on PETSc - your work is very much
>> appreciated!
>>
>> Kind regards,
>> Morten
>> --
>> *From:* Matthew Knepley [knep...@gmail.com]
>> *Sent:* Friday, September 23, 2016 2:46 PM
>> *To:* Morten Nobel-Jørgensen
>> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
>> *Subject:* Re: [petsc-users] DMPlex problem
>>
>> On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley <knep...@gmail.com>
>> wrote:
>>

Re: [petsc-users] DMPlex problem

2016-09-26 Thread Morten Nobel-Jørgensen
Hi Matt

We are trying to do a simple FE assembly using DMPlex, but when assembling the global
stiffness matrix we run into problems when running with NP>1 - that is, the global
matrix differs when we move to a distributed system, where it should not.

In pseudo-code our CreateGlobalStiffnessMatrix does the following (see the C sketch below):
  create a local stiffness matrix ke with some values
  for each local cell/element e (using the result of DMPlexGetHeightStratum(..,0,..))
    for each of its vertices (using DMPlexGetTransitiveClosure(..,e,..))
      collect the local/global indices into edof
    update the global stiffness matrix K using the index mapping edof and the values ke

The code we have sent is a simplified version, which just builds a dummy
stiffness matrix - but we believe this matrix should still be the same independent
of NP. (That is why we use the trace.)
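
For concreteness, a hedged C sketch of the hand-rolled loop described above (not the attached ex18k itself; DMGetDefaultSection() is the 2016-era name for DMGetLocalSection(), the fixed array bounds and the identity element matrix are illustrative assumptions):

#include <petsc.h>

/* Hedged sketch of the assembly pattern: local indices are taken from the
   default (local) PetscSection and fed to MatSetValuesLocal(), which relies on
   the local-to-global mapping attached to the matrix by DMCreateMatrix(). */
static PetscErrorCode CreateGlobalStiffnessMatrix(DM dm, Mat K)
{
  PetscSection   s;
  PetscInt       cStart, cEnd, c;
  PetscScalar    tr;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMGetDefaultSection(dm, &s); CHKERRQ(ierr);  /* DMGetLocalSection() in newer PETSc */
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd); CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    PetscInt    *closure = NULL, nclosure, nedof = 0, edof[64], p;  /* 64 is an assumed bound */
    PetscScalar  ke[64*64];

    ierr = DMPlexGetTransitiveClosure(dm, c, PETSC_TRUE, &nclosure, &closure); CHKERRQ(ierr);
    for (p = 0; p < nclosure; ++p) {
      PetscInt point = closure[2*p], ndof, off, d;  /* closure holds (point, orientation) pairs */
      ierr = PetscSectionGetDof(s, point, &ndof); CHKERRQ(ierr);
      ierr = PetscSectionGetOffset(s, point, &off); CHKERRQ(ierr);
      for (d = 0; d < ndof; ++d) edof[nedof++] = off + d;
    }
    ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE, &nclosure, &closure); CHKERRQ(ierr);
    /* fill ke with dummy values; the attached ex18k uses its own pattern */
    for (p = 0; p < nedof*nedof; ++p) ke[p] = 0.0;
    for (p = 0; p < nedof; ++p) ke[p*nedof + p] = 1.0;
    ierr = MatSetValuesLocal(K, nedof, edof, nedof, edof, ke, ADD_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatGetTrace(K, &tr); CHKERRQ(ierr);  /* the NP-independence check used in the thread */
  ierr = PetscPrintf(PetscObjectComm((PetscObject)K), "Trace of matrix: %.2f\n", (double)PetscRealPart(tr)); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}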

I'm not familiar with MatSetValuesClosure(). Is that the missing piece?

Kind regards,
Morten



From: Matthew Knepley [knep...@gmail.com]
Sent: Monday, September 26, 2016 2:19 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
Subject: Re: [petsc-users] DMPlex problem

On Mon, Sep 26, 2016 at 7:00 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Hi Matthew

It seems like the problem is not fully fixed. I have changed the code to now
run with 2, 3 and 4 cells. When I run the code using NP = 1..3 I get
different results between NP=1 and NP=2/3 when the cell count is larger than 2.

Do you mean the trace? I have no idea what you are actually putting in.

I have a lot of debugging when you use MatSetValuesClosure(), but when you
directly use
MatSetValuesLocal(), you are handling things yourself.

  Thanks,

 Matt

Kind regards,
Morten

mpiexec -np 1 ./ex18k
cells 2
Loc size: 36
Trace of matrix: 132.00
cells 3
Loc size: 48
Trace of matrix: 192.00
cells 4
Loc size: 60
Trace of matrix: 258.00
mpiexec -np 2 ./ex18k
cells 2
Loc size: 24
Loc size: 24
Trace of matrix: 132.00
cells 3
Loc size: 36
Loc size: 24
Trace of matrix: 198.00
cells 4
Loc size: 36
Loc size: 36
Trace of matrix: 264.00
mpiexec -np 3 ./ex18k
cells 2
Loc size: 24
Loc size: 24
Loc size: 0
Trace of matrix: 132.00
cells 3
Loc size: 24
Loc size: 24
Loc size: 24
Trace of matrix: 198.00
cells 4
Loc size: 36
Loc size: 24
Loc size: 24
Trace of matrix: 264.00





From: petsc-users-boun...@mcs.anl.gov<mailto:petsc-users-boun...@mcs.anl.gov> 
[petsc-users-boun...@mcs.anl.gov<mailto:petsc-users-boun...@mcs.anl.gov>] on 
behalf of Morten Nobel-Jørgensen [m...@dtu.dk<mailto:m...@dtu.dk>]
Sent: Sunday, September 25, 2016 11:15 AM
To: Matthew Knepley
Cc: PETSc ‎[petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>]‎
Subject: Re: [petsc-users] DMPlex problem

Hi Matthew

Thank you for the bug-fix :) I can confirm that it works :)

And thanks for your hard work on PETSc - your work is very much appreciated!

Kind regards,
Morten

From: Matthew Knepley [knep...@gmail.com<mailto:knep...@gmail.com>]
Sent: Friday, September 23, 2016 2:46 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley 
<knep...@gmail.com<mailto:knep...@gmail.com>> wrote:
On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers

Any update on this issue regarding DMPlex? Or is there any obvious workaround 
that we are unaware of?

I have fixed this bug. It did not come up in nightly tests because we are not 
using MatSetValuesLocal(). Instead we
use MatSetValuesClosure() which translates differently.

Here is the branch

  https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs

and I have merged it to next. It will go to master in a day or two.

Also, here is the cleaned up source with no memory leaks.

  Matt

Also should we additionally register the issue on Bitbucket or is reporting the 
issue on the mailing list enough?

Normally we are faster, but the start of the semester was hard this year.

  Thanks,

 Matt

Kind regards,
Morten


From: Matthew Knepley [knep...@gmail.com<mailto:knep...@gmail.com>]
Sent: Friday, September 09, 2016 12:21 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers and users,

Last week we posted a question regarding an error with DMPlex and multiple dofs
and have not gotten any feedback yet. This is uncharted waters for us, since we
have gotten used to an extremely fast feedback from the PETSc crew. So - with
the chance of sounding impatient and ungrateful - we would like to hear if
anybody has any ideas that could point us in the right direction?

Re: [petsc-users] DMPlex problem

2016-09-26 Thread Matthew Knepley
On Mon, Sep 26, 2016 at 7:00 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:

> Hi Matthew
>
> It seems like the problem is not fully fixed. I have changed the code to
> now run with 2, 3 and 4 cells. When I run the code using NP = 1..3 I
> get different results between NP=1 and NP=2/3 when the cell count is larger than
> 2.
>

Do you mean the trace? I have no idea what you are actually putting in.

I have a lot of debugging when you use MatSetValuesClosure(), but when you
directly use
MatSetValuesLocal(), you are handling things yourself.

  Thanks,

 Matt


> Kind regards,
> Morten
> 
> mpiexec -np 1 ./ex18k
> cells 2
> Loc size: 36
> Trace of matrix: 132.00
> cells 3
> Loc size: 48
> Trace of matrix: 192.00
> cells 4
> Loc size: 60
> Trace of matrix: 258.00
> mpiexec -np 2 ./ex18k
> cells 2
> Loc size: 24
> Loc size: 24
> Trace of matrix: 132.00
> cells 3
> Loc size: 36
> Loc size: 24
> Trace of matrix: 198.00
> cells 4
> Loc size: 36
> Loc size: 36
> Trace of matrix: 264.00
> mpiexec -np 3 ./ex18k
> cells 2
> Loc size: 24
> Loc size: 24
> Loc size: 0
> Trace of matrix: 132.00
> cells 3
> Loc size: 24
> Loc size: 24
> Loc size: 24
> Trace of matrix: 198.00
> cells 4
> Loc size: 36
> Loc size: 24
> Loc size: 24
> Trace of matrix: 264.00
>
>
>
>
> --
> *From:* petsc-users-boun...@mcs.anl.gov [petsc-users-boun...@mcs.anl.gov]
> on behalf of Morten Nobel-Jørgensen [m...@dtu.dk]
> *Sent:* Sunday, September 25, 2016 11:15 AM
> *To:* Matthew Knepley
> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
> *Subject:* Re: [petsc-users] DMPlex problem
>
> Hi Matthew
>
> Thank you for the bug-fix :) I can confirm that it works :)
>
> And thanks for your hard work on PETSc - your work is very much
> appreciated!
>
> Kind regards,
> Morten
> ----------
> *From:* Matthew Knepley [knep...@gmail.com]
> *Sent:* Friday, September 23, 2016 2:46 PM
> *To:* Morten Nobel-Jørgensen
> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
> *Subject:* Re: [petsc-users] DMPlex problem
>
> On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley <knep...@gmail.com>
> wrote:
>
>> On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen <m...@dtu.dk>
>> wrote:
>>
>>> Dear PETSc developers
>>>
>>> Any update on this issue regarding DMPlex? Or is there any obvious
>>> workaround that we are unaware of?
>>>
>>
>> I have fixed this bug. It did not come up in nightly tests because we are
>> not using MatSetValuesLocal(). Instead we
>> use MatSetValuesClosure() which translates differently.
>>
>> Here is the branch
>>
>>   https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs
>>
>> and I have merged it to next. It will go to master in a day or two.
>>
>
> Also, here is the cleaned up source with no memory leaks.
>
>   Matt
>
>
>> Also should we additionally register the issue on Bitbucket or is
>>> reporting the issue on the mailing list enough?
>>>
>>
>> Normally we are faster, but the start of the semester was hard this year.
>>
>>   Thanks,
>>
>>  Matt
>>
>>
>>> Kind regards,
>>> Morten
>>>
>>> --
>>> *From:* Matthew Knepley [knep...@gmail.com]
>>> *Sent:* Friday, September 09, 2016 12:21 PM
>>> *To:* Morten Nobel-Jørgensen
>>> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
>>> *Subject:* Re: [petsc-users] DMPlex problem
>>>
>>> On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen <m...@dtu.dk>
>>> wrote:
>>>
>>>> Dear PETSc developers and users,
>>>>
>>>> Last week we posted a question regarding an error with DMPlex and
>>>> multiple dofs and have not gotten any feedback yet. This is uncharted
>>>> waters for us, since we have gotten used to an extremely fast feedback from
>>>> the PETSc crew. So - with the chance of sounding impatient and ungrateful -
>>>> we would like to hear if anybody has any ideas that could point us in the
>>>> right direction?
>>>>
>>>
>>> This is my fault. You have not gotten a response because everyone else
>>> was waiting for me, and I have been
>>> slow because I just moved houses at the same time as term started here.
>>> Sorry about that.
>>>
>>> The example ran for me and I saw your problem. The local-to-global map
>>> is missing for some reason.

Re: [petsc-users] DMPlex problem

2016-09-26 Thread Morten Nobel-Jørgensen
Hi Matthew

It seems like the problem is not fully fixed. I have changed the code to now
run with 2, 3 and 4 cells. When I run the code using NP = 1..3 I get
different results between NP=1 and NP=2/3 when the cell count is larger than 2.

Kind regards,
Morten

mpiexec -np 1 ./ex18k
cells 2
Loc size: 36
Trace of matrix: 132.00
cells 3
Loc size: 48
Trace of matrix: 192.00
cells 4
Loc size: 60
Trace of matrix: 258.00
mpiexec -np 2 ./ex18k
cells 2
Loc size: 24
Loc size: 24
Trace of matrix: 132.00
cells 3
Loc size: 36
Loc size: 24
Trace of matrix: 198.00
cells 4
Loc size: 36
Loc size: 36
Trace of matrix: 264.00
mpiexec -np 3 ./ex18k
cells 2
Loc size: 24
Loc size: 24
Loc size: 0
Trace of matrix: 132.00
cells 3
Loc size: 24
Loc size: 24
Loc size: 24
Trace of matrix: 198.00
cells 4
Loc size: 36
Loc size: 24
Loc size: 24
Trace of matrix: 264.00





From: petsc-users-boun...@mcs.anl.gov [petsc-users-boun...@mcs.anl.gov] on 
behalf of Morten Nobel-Jørgensen [m...@dtu.dk]
Sent: Sunday, September 25, 2016 11:15 AM
To: Matthew Knepley
Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
Subject: Re: [petsc-users] DMPlex problem

Hi Matthew

Thank you for the bug-fix :) I can confirm that it works :)

And thanks for your hard work on PETSc - your work is very much appreciated!

Kind regards,
Morten

From: Matthew Knepley [knep...@gmail.com]
Sent: Friday, September 23, 2016 2:46 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley 
<knep...@gmail.com<mailto:knep...@gmail.com>> wrote:
On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers

Any update on this issue regarding DMPlex? Or is there any obvious workaround 
that we are unaware of?

I have fixed this bug. It did not come up in nightly tests because we are not 
using MatSetValuesLocal(). Instead we
use MatSetValuesClosure() which translates differently.

Here is the branch

  https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs

and I have merged it to next. It will go to master in a day or two.

Also, here is the cleaned up source with no memory leaks.

  Matt

Also should we additionally register the issue on Bitbucket or is reporting the 
issue on the mailing list enough?

Normally we are faster, but the start of the semester was hard this year.

  Thanks,

 Matt

Kind regards,
Morten


From: Matthew Knepley [knep...@gmail.com<mailto:knep...@gmail.com>]
Sent: Friday, September 09, 2016 12:21 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers and users,

Last week we posted a question regarding an error with DMPlex and multiple dofs 
and have not gotten any feedback yet. This is uncharted waters for us, since we 
have gotten used to an extremely fast feedback from the PETSc crew. So - with 
the chance of sounding impatient and ungrateful - we would like to hear if 
anybody has any ideas that could point us in the right direction?

This is my fault. You have not gotten a response because everyone else was 
waiting for me, and I have been
slow because I just moved houses at the same time as term started here. Sorry 
about that.

The example ran for me and I saw your problem. The local-to-global map is 
missing for some reason.
I am tracking it down now. It should be made by DMCreateMatrix(), so this is 
mysterious. I hope to have
this fixed by early next week.

  Thanks,

Matt

We have created a small example problem that demonstrates the error in the 
matrix assembly.

Thanks,
Morten





--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
static char help[] = "";

#include <petsc.h>   /* header name stripped by the archive; <petsc.h> assumed */

const int dof = 3;

#undef __FUNCT__
#define __FUNCT__ "SetupDOFs"

/* Assign dofs (3 for each node) */
PetscErrorCode SetupDOFs(DM dm) {
PetscSection s;
PetscInt pStart, pEnd, cStart, cEnd, c, fStart, fEnd, f, vStart, vEnd, v, eStart, eEnd, e;
PetscErrorCode ierr;

PetscFunctionBeginUser;
ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s); CHKERRQ(ierr);
ierr = PetscSectionSetNumFields(s, 1); CHKERRQ(ierr);
/* The attached source is truncated here in the archive; below is a minimal
   reconstruction of a vertex-based setup matching the comment above. */
ierr = DMPlexGetChart(dm, &pStart, &pEnd); CHKERRQ(ierr);
ierr = PetscSectionSetChart(s, pStart, pEnd); CHKERRQ(ierr);
ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd); CHKERRQ(ierr);
for (v = vStart; v < vEnd; ++v) {
  ierr = PetscSectionSetDof(s, v, dof); CHKERRQ(ierr);
  ierr = PetscSectionSetFieldDof(s, v, 0, dof); CHKERRQ(ierr);
}
ierr = PetscSectionSetUp(s); CHKERRQ(ierr);
ierr = DMSetDefaultSection(dm, s); CHKERRQ(ierr);   /* DMSetLocalSection() in newer PETSc */
ierr = PetscSectionDestroy(&s); CHKERRQ(ierr);
PetscFunctionReturn(0);
}

Re: [petsc-users] DMPlex problem

2016-09-25 Thread Morten Nobel-Jørgensen
Hi Matthew

Thank you for the bug-fix :) I can confirm that it works :)

And thanks for your hard work on PETSc - your work is very much appreciated!

Kind regards,
Morten

From: Matthew Knepley [knep...@gmail.com]
Sent: Friday, September 23, 2016 2:46 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley 
<knep...@gmail.com<mailto:knep...@gmail.com>> wrote:
On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers

Any update on this issue regarding DMPlex? Or is there any obvious workaround 
that we are unaware of?

I have fixed this bug. It did not come up in nightly tests because we are not 
using MatSetValuesLocal(). Instead we
use MatSetValuesClosure() which translates differently.

Here is the branch

  https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs

and I have merged it to next. It will go to master in a day or two.

Also, here is the cleaned up source with no memory leaks.

  Matt

Also should we additionally register the issue on Bitbucket or is reporting the 
issue on the mailing list enough?

Normally we are faster, but the start of the semester was hard this year.

  Thanks,

 Matt

Kind regards,
Morten


From: Matthew Knepley [knep...@gmail.com<mailto:knep...@gmail.com>]
Sent: Friday, September 09, 2016 12:21 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers and users,

Last week we posted a question regarding an error with DMPlex and multiple dofs 
and have not gotten any feedback yet. This is uncharted waters for us, since we 
have gotten used to an extremely fast feedback from the PETSc crew. So - with 
the chance of sounding impatient and ungrateful - we would like to hear if 
anybody has any ideas that could point us in the right direction?

This is my fault. You have not gotten a response because everyone else was 
waiting for me, and I have been
slow because I just moved houses at the same time as term started here. Sorry 
about that.

The example ran for me and I saw your problem. The local-to-global map is 
missing for some reason.
I am tracking it down now. It should be made by DMCreateMatrix(), so this is 
mysterious. I hope to have
this fixed by early next week.

  Thanks,

Matt

We have created a small example problem that demonstrates the error in the 
matrix assembly.

Thanks,
Morten





--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener


Re: [petsc-users] DMPlex problem

2016-09-23 Thread Matthew Knepley
On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley <knep...@gmail.com> wrote:

> On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen <m...@dtu.dk>
> wrote:
>
>> Dear PETSc developers
>>
>> Any update on this issue regarding DMPlex? Or is there any obvious
>> workaround that we are unaware of?
>>
>
> I have fixed this bug. It did not come up in nightly tests because we are
> not using MatSetValuesLocal(). Instead we
> use MatSetValuesClosure() which translates differently.
>
> Here is the branch
>
>   https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs
>
> and I have merged it to next. It will go to master in a day or two.
>

Also, here is the cleaned up source with no memory leaks.

  Matt


> Also should we additionally register the issue on Bitbucket or is
>> reporting the issue on the mailing list enough?
>>
>
> Normally we are faster, but the start of the semester was hard this year.
>
>   Thanks,
>
>  Matt
>
>
>> Kind regards,
>> Morten
>>
>> --
>> *From:* Matthew Knepley [knep...@gmail.com]
>> *Sent:* Friday, September 09, 2016 12:21 PM
>> *To:* Morten Nobel-Jørgensen
>> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
>> *Subject:* Re: [petsc-users] DMPlex problem
>>
>> On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen <m...@dtu.dk>
>> wrote:
>>
>>> Dear PETSc developers and users,
>>>
>>> Last week we posted a question regarding an error with DMPlex and
>>> multiple dofs and have not gotten any feedback yet. This is uncharted
>>> waters for us, since we have gotten used to an extremely fast feedback from
>>> the PETSc crew. So - with the chance of sounding impatient and ungrateful -
>>> we would like to hear if anybody has any ideas that could point us in the
>>> right direction?
>>>
>>
>> This is my fault. You have not gotten a response because everyone else
>> was waiting for me, and I have been
>> slow because I just moved houses at the same time as term started here.
>> Sorry about that.
>>
>> The example ran for me and I saw your problem. The local-to-global map is
>> missing for some reason.
>> I am tracking it down now. It should be made by DMCreateMatrix(), so this
>> is mysterious. I hope to have
>> this fixed by early next week.
>>
>>   Thanks,
>>
>> Matt
>>
>>
>>> We have created a small example problem that demonstrates the error in
>>> the matrix assembly.
>>>
>>> Thanks,
>>> Morten
>>>
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


ex18.c
Description: Binary data


Re: [petsc-users] DMPlex problem

2016-09-23 Thread Matthew Knepley
On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:

> Dear PETSc developers
>
> Any update on this issue regarding DMPlex? Or is there any obvious
> workaround that we are unaware of?
>

I have fixed this bug. It did not come up in nightly tests because we are
not using MatSetValuesLocal(). Instead we
use MatSetValuesClosure() which translates differently.

Here is the branch

  https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs

and I have merged it to next. It will go to master in a day or two.
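
Since the branch name points at the block size of the DM's local-to-global map, a hedged debugging sketch for inspecting that map (the calls are standard PETSc; the check itself is only an illustration, not part of the fix):

#include <petsc.h>

/* Hedged sketch: look at the local-to-global mapping carried by the DM, which
   MatSetValuesLocal() ultimately relies on, and print its block size. */
static PetscErrorCode CheckLocalToGlobal(DM dm)
{
  ISLocalToGlobalMapping ltog;
  PetscInt               bs;
  PetscErrorCode         ierr;

  PetscFunctionBeginUser;
  ierr = DMGetLocalToGlobalMapping(dm, &ltog); CHKERRQ(ierr);
  ierr = ISLocalToGlobalMappingGetBlockSize(ltog, &bs); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "l2g block size: %D\n", bs); CHKERRQ(ierr);
  ierr = ISLocalToGlobalMappingView(ltog, PETSC_VIEWER_STDOUT_WORLD); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}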


> Also should we additionally register the issue on Bitbucket or is
> reporting the issue on the mailing list enough?
>

Normally we are faster, but the start of the semester was hard this year.

  Thanks,

 Matt


> Kind regards,
> Morten
>
> --
> *From:* Matthew Knepley [knep...@gmail.com]
> *Sent:* Friday, September 09, 2016 12:21 PM
> *To:* Morten Nobel-Jørgensen
> *Cc:* PETSc ‎[petsc-users@mcs.anl.gov]‎
> *Subject:* Re: [petsc-users] DMPlex problem
>
> On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen <m...@dtu.dk>
> wrote:
>
>> Dear PETSc developers and users,
>>
>> Last week we posted a question regarding an error with DMPlex and
>> multiple dofs and have not gotten any feedback yet. This is uncharted
>> waters for us, since we have gotten used to an extremely fast feedback from
>> the PETSc crew. So - with the chance of sounding impatient and ungrateful -
>> we would like to hear if anybody has any ideas that could point us in the
>> right direction?
>>
>
> This is my fault. You have not gotten a response because everyone else was
> waiting for me, and I have been
> slow because I just moved houses at the same time as term started here.
> Sorry about that.
>
> The example ran for me and I saw your problem. The local-to-global map is
> missing for some reason.
> I am tracking it down now. It should be made by DMCreateMatrix(), so this
> is mysterious. I hope to have
> this fixed by early next week.
>
>   Thanks,
>
> Matt
>
>
>> We have created a small example problem that demonstrates the error in
>> the matrix assembly.
>>
>> Thanks,
>> Morten
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


Re: [petsc-users] DMPlex problem

2016-09-23 Thread Morten Nobel-Jørgensen
Dear PETSc developers

Any update on this issue regarding DMPlex? Or is there any obvious workaround 
that we are unaware of?


Also should we additionally register the issue on Bitbucket or is reporting the 
issue on the mailing list enough?

Kind regards,
Morten


From: Matthew Knepley [knep...@gmail.com]
Sent: Friday, September 09, 2016 12:21 PM
To: Morten Nobel-Jørgensen
Cc: PETSc ‎[petsc-users@mcs.anl.gov]‎
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen 
<m...@dtu.dk<mailto:m...@dtu.dk>> wrote:
Dear PETSc developers and users,

Last week we posted a question regarding an error with DMPlex and multiple dofs 
and have not gotten any feedback yet. This is uncharted waters for us, since we 
have gotten used to an extremely fast feedback from the PETSc crew. So - with 
the chance of sounding impatient and ungrateful - we would like to hear if 
anybody has any ideas that could point us in the right direction?

This is my fault. You have not gotten a response because everyone else was 
waiting for me, and I have been
slow because I just moved houses at the same time as term started here. Sorry 
about that.

The example ran for me and I saw your problem. The local-to-global map is 
missing for some reason.
I am tracking it down now. It should be made by DMCreateMatrix(), so this is 
mysterious. I hope to have
this fixed by early next week.

  Thanks,

Matt

We have created a small example problem that demonstrates the error in the 
matrix assembly.

Thanks,
Morten





--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener


Re: [petsc-users] DMPlex problem

2016-09-09 Thread Matthew Knepley
On Fri, Sep 9, 2016 at 9:49 AM, Mark Lohry  wrote:

> Regarding DMPlex, I'm unclear on the separation between Index Sets and
> Plex for unstructured solvers. I've got an existing unstructured serial
> solver using petsc's newton-krylov
>
 Index Sets are lists of integers. They can mean whatever you want, but
they are just lists.

Plex stores mesh topology and has topological query functions.
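
As a hedged illustration of such topological queries (the point number p and the printing are arbitrary):

#include <petsc.h>

/* Hedged sketch: basic DMPlex topology queries for one mesh point p. */
static PetscErrorCode ShowPointTopology(DM dm, PetscInt p)
{
  const PetscInt *cone;
  PetscInt        coneSize, supportSize, i;
  PetscErrorCode  ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetConeSize(dm, p, &coneSize); CHKERRQ(ierr);
  ierr = DMPlexGetCone(dm, p, &cone); CHKERRQ(ierr);               /* points on the boundary of p */
  ierr = DMPlexGetSupportSize(dm, p, &supportSize); CHKERRQ(ierr); /* how many points p bounds */
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "point %D: support size %D, cone =", p, supportSize); CHKERRQ(ierr);
  for (i = 0; i < coneSize; ++i) {
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, " %D", cone[i]); CHKERRQ(ierr);
  }
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "\n"); CHKERRQ(ierr);
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}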

> solvers. Should I be looking to parallelize this via IS or plex? Is there
> an interface for either that allows me to specify the partitioning (e.g. by
> metis' output)? Last but not least, is
>
There is DMPlexDistribute(), which partitions and distributes, or you can
use DMPlexMigrate() if you want to calculate
your own partition for some reason.
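
A hedged sketch of the usual DMPlexDistribute() pattern, assuming PETSc was configured with ParMETIS (otherwise keep the default partitioner or pick another PetscPartitioner type):

#include <petsc.h>

/* Hedged sketch: partition and distribute a DMPlex that currently lives on one rank. */
static PetscErrorCode DistributeMesh(DM *dm)
{
  DM               dmDist = NULL;
  PetscPartitioner part;
  PetscErrorCode   ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetPartitioner(*dm, &part); CHKERRQ(ierr);
  ierr = PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS); CHKERRQ(ierr);
  ierr = DMPlexDistribute(*dm, 0, NULL, &dmDist); CHKERRQ(ierr);  /* 0 = no overlap */
  if (dmDist) { ierr = DMDestroy(dm); CHKERRQ(ierr); *dm = dmDist; }
  PetscFunctionReturn(0);
}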

> there a working example of dmplex in the docs? The only unstructured code
> example I see in the docs is SNES ex10, which uses IS.
>
SNES ex12, ex62, ex77 and TS ex11.

  Thanks,

Matt

> Thanks,
>
> Mark
>
> On 09/09/2016 04:21 AM, Matthew Knepley wrote:
>
> On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen 
> wrote:
>
>> Dear PETSc developers and users,
>>
>> Last week we posted a question regarding an error with DMPlex and
>> multiple dofs and have not gotten any feedback yet. This is uncharted
>> waters for us, since we have gotten used to an extremely fast feedback from
>> the PETSc crew. So - with the chance of sounding impatient and ungrateful -
>> we would like to hear if anybody has any ideas that could point us in the
>> right direction?
>>
>
> This is my fault. You have not gotten a response because everyone else was
> waiting for me, and I have been
> slow because I just moved houses at the same time as term started here.
> Sorry about that.
>
> The example ran for me and I saw your problem. The local-to-global map is
> missing for some reason.
> I am tracking it down now. It should be made by DMCreateMatrix(), so this
> is mysterious. I hope to have
> this fixed by early next week.
>
>   Thanks,
>
> Matt
>
>
>> We have created a small example problem that demonstrates the error in
>> the matrix assembly.
>>
>> Thanks,
>> Morten
>>
>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener


Re: [petsc-users] DMPlex problem

2016-09-09 Thread Mark Lohry
Regarding DMPlex, I'm unclear on the separation between Index Sets and 
Plex for unstructured solvers. I've got an existing unstructured serial 
solver using petsc's newton-krylov solvers. Should I be looking to 
parallelize this via IS or plex? Is there an interface for either that 
allows me to specify the partitioning (e.g. by metis' output)? Last but 
not least, is there a working example of dmplex in the docs? The only 
unstructured code example I see in the docs is SNES ex10, which uses IS.


Thanks,

Mark


On 09/09/2016 04:21 AM, Matthew Knepley wrote:
On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:


Dear PETSc developers and users,

Last week we posted a question regarding an error with DMPlex and
multiple dofs and have not gotten any feedback yet. This is
uncharted waters for us, since we have gotten used to an extremely
fast feedback from the PETSc crew. So - with the chance of
sounding impatient and ungrateful - we would like to hear if
anybody has any ideas that could point us in the right direction?


This is my fault. You have not gotten a response because everyone else 
was waiting for me, and I have been
slow because I just moved houses at the same time as term started 
here. Sorry about that.


The example ran for me and I saw your problem. The local-to-global map 
is missing for some reason.
I am tracking it down now. It should be made by DMCreateMatrix(), so 
this is mysterious. I hope to have

this fixed by early next week.

  Thanks,

Matt

We have created a small example problem that demonstrates the
error in the matrix assembly.

Thanks,
Morten





--
What most experimenters take for granted before they begin their 
experiments is infinitely more interesting than any results to which 
their experiments lead.

-- Norbert Wiener




Re: [petsc-users] DMPlex problem

2016-09-09 Thread Matthew Knepley
On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen  wrote:

> Dear PETSc developers and users,
>
> Last week we posted a question regarding an error with DMPlex and multiple
> dofs and have not gotten any feedback yet. This is uncharted waters for us,
> since we have gotten used to an extremely fast feedback from the PETSc
> crew. So - with the chance of sounding impatient and ungrateful - we would
> like to hear if anybody has any ideas that could point us in the right
> direction?
>

This is my fault. You have not gotten a response because everyone else was
waiting for me, and I have been
slow because I just moved houses at the same time as term started here.
Sorry about that.

The example ran for me and I saw your problem. The local-to-global map is
missing for some reason.
I am tracking it down now. It should be made by DMCreateMatrix(), so this
is mysterious. I hope to have
this fixed by early next week.
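
A hedged sanity check along these lines (illustration only): DMCreateMatrix() should hand back a Mat that already carries the mapping MatSetValuesLocal() needs.

#include <petsc.h>

/* Hedged sketch: verify that the matrix from DMCreateMatrix() carries a
   local-to-global mapping; MatSetValuesLocal() cannot work without it. */
static PetscErrorCode CheckMatrixMapping(DM dm)
{
  Mat                    K;
  ISLocalToGlobalMapping rmap, cmap;
  PetscErrorCode         ierr;

  PetscFunctionBeginUser;
  ierr = DMCreateMatrix(dm, &K); CHKERRQ(ierr);
  ierr = MatGetLocalToGlobalMapping(K, &rmap, &cmap); CHKERRQ(ierr);
  if (!rmap || !cmap) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_PLIB, "Matrix has no local-to-global mapping");
  ierr = MatDestroy(&K); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}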

  Thanks,

Matt


> We have created a small example problem that demonstrates the error in the
> matrix assembly.
>
> Thanks,
> Morten
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener