Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Sepideh Kavousi
It is really a pleasure for me.
Thanks,
Sepideh



From: Dave May 
Sent: Wednesday, April 18, 2018 11:14:47 PM
To: Matthew Knepley
Cc: Sepideh Kavousi; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster



On 18 April 2018 at 23:52, Matthew Knepley wrote:
On Wed, Apr 18, 2018 at 5:52 PM, Sepideh Kavousi wrote:

Matthew and Dave,

Thank you so much, it is working perfectly now.

Excellent.

If you want your name to appear on the next PETSc release as a contributor, you
can make a PR with this change :)

Here is a URL describing the PR's protocol for PETSc contribs:

https://bitbucket.org/petsc/petsc/wiki/pull-request-instructions-git



  Thanks,

 Matt


Sepideh


From: Dave May
Sent: Wednesday, April 18, 2018 3:13:33 PM
To: Sepideh Kavousi
Cc: Matthew Knepley; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster



On 18 April 2018 at 21:06, Sepideh Kavousi wrote:

Matthew,

I added the lines and I still have the same issue. It may be a silly question, but should I 
configure and install PETSc again with these new lines added, or is changing the lines enough? 
The highlighted lines are the ones I modified.


PetscErrorCode ierr;
  DM             dm;
  DMTS_DA        *dmdats = (DMTS_DA*)ctx;
  DMDALocalInfo  info;
  Vec            Xloc,Xdotloc;
  void           *x,*f,*xdot;

  PetscFunctionBegin;
  PetscValidHeaderSpecific(ts,TS_CLASSID,1);
  PetscValidHeaderSpecific(X,VEC_CLASSID,2);
  PetscValidHeaderSpecific(F,VEC_CLASSID,3);
  if (!dmdats->ifunctionlocal) SETERRQ(PetscObjectComm((PetscObject)ts),PETSC_ERR_PLIB,"Corrupt context");
  ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
  ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
  ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);

Don't forget to include these calls (in this order) after you are done with the 
Xdotloc vector

ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);

Failure to do so will result in a memory leak.
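For reference, the matching cleanup at the end of the routine might look like the following
sketch, assuming Xloc/x are restored in the same symmetric way (exact placement depends on
the rest of the function):

  ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(dm,Xloc,&x);CHKERRQ(ierr);   /* assumed: Xloc/x restored symmetrically */
  ierr = DMRestoreLocalVector(dm,&Xloc);CHKERRQ(ierr);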



Thanks,

Sepideh


From: Matthew Knepley
Sent: Tuesday, April 17, 2018 5:59:12 PM

To: Sepideh Kavousi
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster

On Tue, Apr 17, 2018 at 3:07 PM, Sepideh Kavousi wrote:

The reason I cannot use your method is that the input arguments of SetIFunctionLocal are the 
arrays of x, x_t instead of the x, x_t vectors. In your method, which was:

ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);

You misunderstand my suggestion. I mean stick this code in right here in PETSc

https://bitbucket.org/petsc/petsc/annotate/be3efd428a942676a0189b3273b3c582694ff011/src/ts/utils/dmdats.c?at=master&fileviewer=file-view-default#dmdats.c-68

Then the X_t array you get in your local function will be ghosted.

   Matt


I need to have the vector of Xdot, not the array. So I think I should use 
SetIFunction instead of SetIFunctionLocal.
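(For reference, a rough sketch of what a TSSetIFunction-style callback with a hand-ghosted Xdot
could look like; the name FormIFunction and the user context are illustrative, not the code
discussed in this thread:)

PetscErrorCode FormIFunction(TS ts,PetscReal t,Vec X,Vec Xdot,Vec F,void *ctx)
{
  DM             dm;
  DMDALocalInfo  info;
  Vec            Xloc,Xdotloc;
  void           *x,*xdot,*f;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
  ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
  /* scatter X and Xdot into ghosted local vectors */
  ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
  ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  /* F is a global vector, so the output needs no scatter */
  ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,F,&f);CHKERRQ(ierr);
  /* ... evaluate the residual into f using x, xdot and info ... */
  ierr = DMDAVecRestoreArray(dm,F,&f);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
  ierr = DMDAVecRestoreArray(dm,Xloc,&x);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(dm,&Xloc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

It would be registered in main, after TSSetDM(ts,da), with something like
TSSetIFunction(ts,NULL,FormIFunction,&user); where "user" stands in for the application context.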


Sepideh


From: Matthew Knepley
Sent: Tuesday, April 17, 2018 1:22:53 PM
To: Sepideh Kavousi
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster

On Tue, Apr 17, 2018 at 1:50 PM, Sepideh Kavousi wrote:

Matthew,

I previously used
DMDATSSetIFunctionLocal(user.da,INSERT_VALUES,(DMDATSIFunctionLocal) FormFunction,&user)
in my code. If I want to use your solution, I cannot, because in the FormFunction definition 
I must use arrays, not vectors. To solve this issue I followed two methods, and neither was 
able to solve it.
1- In the first method I decided to use TSSetIFunction instead of 
DMDATSSetIFunctionLocal.

For this, first in the main function I use TSSetDM, and my form function 
variables were:
PetscErrorCode FormFunction(TS ts,PetscScalar 

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Dave May
On 18 April 2018 at 23:52, Matthew Knepley  wrote:

> On Wed, Apr 18, 2018 at 5:52 PM, Sepideh Kavousi  wrote:
>
>> Mathew and Dave,
>>
>> Thank you so much it is working perfectly now.
>>
>
> Excellent.
>
> If you want your name to appear on the next PETSc release as a
> contributor, you
> can make a PR with this change :)
>

Here is a URL describing the PR's protocol for PETSc contribs:

https://bitbucket.org/petsc/petsc/wiki/pull-request-instructions-git



>
>   Thanks,
>
>  Matt
>
>
>> Sepideh
>> --
>> *From:* Dave May 
>> *Sent:* Wednesday, April 18, 2018 3:13:33 PM
>> *To:* Sepideh Kavousi
>> *Cc:* Matthew Knepley; petsc-users@mcs.anl.gov
>> *Subject:* Re: [petsc-users] error running parallel on cluster
>>
>>
>>
>> On 18 April 2018 at 21:06, Sepideh Kavousi  wrote:
>>
>> Mathew
>>
>> I added the lines and I still have the same issue. It may be a silly
>> question but should I configure and install petsc again using this new
>> lines added? or changing the line is enough? the highlighted lines are the
>> lines I modified.
>>
>>
>> PetscErrorCode ierr;
>>   DM             dm;
>>   DMTS_DA        *dmdats = (DMTS_DA*)ctx;
>>   DMDALocalInfo  info;
>>   Vec            Xloc,Xdotloc;
>>   void           *x,*f,*xdot;
>>
>>   PetscFunctionBegin;
>>   PetscValidHeaderSpecific(ts,TS_CLASSID,1);
>>   PetscValidHeaderSpecific(X,VEC_CLASSID,2);
>>   PetscValidHeaderSpecific(F,VEC_CLASSID,3);
>>   if (!dmdats->ifunctionlocal) SETERRQ(PetscObjectComm((PetscObject)ts),PETSC_ERR_PLIB,"Corrupt context");
>>   ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
>>   ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>>
>>   ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>>   ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
>>   ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
>>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>>
>>
>> Don't forget to include these calls (in this order) after you are done
>> with the Xdotloc vector
>>
>> ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>> ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>>
>> Failure to do so will result in a memory leak.
>>
>>
>>
>> Thanks,
>>
>> Sepideh
>> --
>> *From:* Matthew Knepley 
>> *Sent:* Tuesday, April 17, 2018 5:59:12 PM
>>
>> *To:* Sepideh Kavousi
>> *Cc:* petsc-users@mcs.anl.gov
>> *Subject:* Re: [petsc-users] error running parallel on cluster
>>
>> On Tue, Apr 17, 2018 at 3:07 PM, Sepideh Kavousi  wrote:
>>
>> The reason  I can not use your method is that the input arguments of
>> SetIFunctionLocal are the arrays of x,x_t instead of x,x_t vectors. In your
>> method which was:
>>
>> ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>>
>>
>> You misunderstand my suggestion. I mean stick this code in right here in
>> PETSc
>>
>> https://bitbucket.org/petsc/petsc/annotate/be3efd428a942676a0189b3273b3c582694ff011/src/ts/utils/dmdats.c?at=master&fileviewer=file-view-default#dmdats.c-68
>>
>> Then the X_t array you get in your local function will be ghosted.
>>
>>Matt
>>
>>
>> I need to have the vector of Xdot, not the array. So I think I should use
>> SetIFunction instead of SetIFunctionLocal.
>>
>>
>> Sepideh
>> --
>> *From:* Matthew Knepley 
>> *Sent:* Tuesday, April 17, 2018 1:22:53 PM
>> *To:* Sepideh Kavousi
>> *Cc:* petsc-users@mcs.anl.gov
>> *Subject:* Re: [petsc-users] error running parallel on cluster
>>
>> On Tue, Apr 17, 2018 at 1:50 PM, Sepideh Kavousi  wrote:
>>
>> Mathew,
>> I previously use DMDATSSetIFunctionLocal(user.da,INSERT_VALUES,(DMDATSIFunctionLocal) FormFunction,&user) in my code.
>> If I want to use your solution I can not use it because in the FormFunction
>> definition I must use arrays, not vectors.So to solve this issue I followed
>> two methods where none were able to solve it.
>> 1- in first method I decided to use TSSetIFunction instead of
>> DMDATSSetIFunctionLocal
>>
>> for this means first in the main function, I use TSSetDM and  my form
>> function variables were as:
>> PetscErrorCode FormFunction(TS ts,PetscScalar t,Vec Y,Vec Ydot,Vec F,
>> struct VAR_STRUCT *user) {
>> .
>> .
>> .
>> .
>> ierr = TSGetDM(ts,&dmda);CHKERRQ(ierr);
>> ierr = DMDAGetLocalInfo(dmda,&info);CHKERRQ(ierr);
>>
>> ierr = DMGetLocalVector(dmda,&Ydot_local);CHKERRQ(ierr);
>> ierr = 

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Matthew Knepley
On Wed, Apr 18, 2018 at 5:52 PM, Sepideh Kavousi  wrote:

> Mathew and Dave,
>
> Thank you so much it is working perfectly now.
>

Excellent.

If you want your name to appear on the next PETSc release as a contributor,
you
can make a PR with this change :)

  Thanks,

 Matt


> Sepideh
> --
> *From:* Dave May 
> *Sent:* Wednesday, April 18, 2018 3:13:33 PM
> *To:* Sepideh Kavousi
> *Cc:* Matthew Knepley; petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
>
>
> On 18 April 2018 at 21:06, Sepideh Kavousi  wrote:
>
> Mathew
>
> I added the lines and I still have the same issue. It may be a silly
> question but should I configure and install petsc again using this new
> lines added? or changing the line is enough? the highlighted lines are the
> lines I modified.
>
>
> PetscErrorCode ierr;
>   DM             dm;
>   DMTS_DA        *dmdats = (DMTS_DA*)ctx;
>   DMDALocalInfo  info;
>   Vec            Xloc,Xdotloc;
>   void           *x,*f,*xdot;
>
>   PetscFunctionBegin;
>   PetscValidHeaderSpecific(ts,TS_CLASSID,1);
>   PetscValidHeaderSpecific(X,VEC_CLASSID,2);
>   PetscValidHeaderSpecific(F,VEC_CLASSID,3);
>   if (!dmdats->ifunctionlocal) SETERRQ(PetscObjectComm((PetscObject)ts),PETSC_ERR_PLIB,"Corrupt context");
>   ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
>   ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>   ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>
>
> Don't forget to include these calls (in this order) after you are done
> with the Xdotloc vector
>
> ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
> ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>
> Failure to do so will result in a memory leak.
>
>
>
> Thanks,
>
> Sepideh
> --
> *From:* Matthew Knepley 
> *Sent:* Tuesday, April 17, 2018 5:59:12 PM
>
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
> On Tue, Apr 17, 2018 at 3:07 PM, Sepideh Kavousi  wrote:
>
> The reason  I can not use your method is that the input arguments of
> SetIFunctionLocal are the arrays of x,x_t instead of x,x_t vectors. In your
> method which was:
>
> ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>
>
> You misunderstand my suggestion. I mean stick this code in right here in
> PETSc
>
> https://bitbucket.org/petsc/petsc/annotate/be3efd428a942676a0189b3273b3c582694ff011/src/ts/utils/dmdats.c?at=master&fileviewer=file-view-default#dmdats.c-68
>
> Then the X_t array you get in your local function will be ghosted.
>
>Matt
>
>
> I need to have the vector of Xdot, not the array. So I think I should use
> SetIFunction instead of SetIFunctionLocal.
>
>
> Sepideh
> --
> *From:* Matthew Knepley 
> *Sent:* Tuesday, April 17, 2018 1:22:53 PM
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
> On Tue, Apr 17, 2018 at 1:50 PM, Sepideh Kavousi  wrote:
>
> Mathew,
> I previously use DMDATSSetIFunctionLocal(user.da,INSERT_VALUES,(DMDATSIFunctionLocal) FormFunction,&user) in my code.
> If I want to use your solution I can not use it because in the FormFunction
> definition I must use arrays, not vectors.So to solve this issue I followed
> two methods where none were able to solve it.
> 1- in first method I decided to use TSSetIFunction instead of
> DMDATSSetIFunctionLocal
>
> for this means first in the main function, I use TSSetDM and  my form
> function variables were as:
> PetscErrorCode FormFunction(TS ts,PetscScalar t,Vec Y,Vec Ydot,Vec F,
> struct VAR_STRUCT *user) {
> .
> .
> .
> .
> ierr = TSGetDM(ts,&dmda);CHKERRQ(ierr);
> ierr = DMDAGetLocalInfo(dmda,&info);CHKERRQ(ierr);
>
> ierr = DMGetLocalVector(dmda,&Ydot_local);CHKERRQ(ierr);
> ierr = DMGlobalToLocalBegin(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
> ierr = DMGlobalToLocalEnd(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
> .
> .
> .
>
> }
> But still, it does not consider vectors y,ydot,f related to dmda (problem
> executing DMDAVecGetArray)
>
>
> We cannot help you if you do not show full error messages.
>
> Why not fix the 

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Sepideh Kavousi
Mathew and Dave,

Thank you so much it is working perfectly now.

Sepideh


From: Dave May 
Sent: Wednesday, April 18, 2018 3:13:33 PM
To: Sepideh Kavousi
Cc: Matthew Knepley; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster



On 18 April 2018 at 21:06, Sepideh Kavousi wrote:

Mathew

I added the lines and I still have the same issue. It may be a silly question 
but should I configure and install petsc again using this new lines added? or 
changing the line is enough? the highlighted lines are the lines I modified.


PetscErrorCode ierr;
  DM             dm;
  DMTS_DA        *dmdats = (DMTS_DA*)ctx;
  DMDALocalInfo  info;
  Vec            Xloc,Xdotloc;
  void           *x,*f,*xdot;

  PetscFunctionBegin;
  PetscValidHeaderSpecific(ts,TS_CLASSID,1);
  PetscValidHeaderSpecific(X,VEC_CLASSID,2);
  PetscValidHeaderSpecific(F,VEC_CLASSID,3);
  if (!dmdats->ifunctionlocal) SETERRQ(PetscObjectComm((PetscObject)ts),PETSC_ERR_PLIB,"Corrupt context");
  ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
  ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
  ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);

Don't forget to include these calls (in this order) after you are done with the 
Xdotloc vector

ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);

Failure to do so will result in a memory leak.



Thanks,

Sepideh


From: Matthew Knepley
Sent: Tuesday, April 17, 2018 5:59:12 PM

To: Sepideh Kavousi
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster

On Tue, Apr 17, 2018 at 3:07 PM, Sepideh Kavousi wrote:

The reason  I can not use your method is that the input arguments of 
SetIFunctionLocal are the arrays of x,x_t instead of x,x_t vectors. In your 
method which was:

ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
  ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);

You misunderstand my suggestion. I mean stick this code in right here in PETSc

https://bitbucket.org/petsc/petsc/annotate/be3efd428a942676a0189b3273b3c582694ff011/src/ts/utils/dmdats.c?at=master&fileviewer=file-view-default#dmdats.c-68

Then the X_t array you get in your local function will be ghosted.

   Matt


I need to have the vector of Xdot, not the array. So I think I should use 
SetIFunction instead of SetIFunctionLocal.


Sepideh


From: Matthew Knepley
Sent: Tuesday, April 17, 2018 1:22:53 PM
To: Sepideh Kavousi
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] error running parallel on cluster

On Tue, Apr 17, 2018 at 1:50 PM, Sepideh Kavousi wrote:

Mathew,

I previously use 
DMDATSSetIFunctionLocal(user.da,INSERT_VALUES,(DMDATSIFunctionLocal) 
FormFunction,&user) in my code.  If I want to use your solution I can not use 
it because in the FormFunction definition I must use arrays, not vectors.So to 
solve this issue I followed two methods where none were able to solve it.
1- in first method I decided to use TSSetIFunction instead of 
DMDATSSetIFunctionLocal

for this means first in the main function, I use TSSetDM and  my form function 
variables were as:
PetscErrorCode FormFunction(TS ts,PetscScalar t,Vec Y,Vec Ydot,Vec F, struct 
VAR_STRUCT *user) {
.
.
.
.
ierr = TSGetDM(ts,&dmda);CHKERRQ(ierr);
ierr = DMDAGetLocalInfo(dmda,&info);CHKERRQ(ierr);

ierr = DMGetLocalVector(dmda,&Ydot_local);CHKERRQ(ierr);
ierr = 
DMGlobalToLocalBegin(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
ierr = DMGlobalToLocalEnd(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
.
.
.

}
But still, it does not consider vectors y,ydot,f related to dmda (problem 
executing DMDAVecGetArray)

We cannot help you if you do not show full error messages.

Why not fix the code with SetIFunctionLocal(), as I said in my last email. I 
will fix PETSc proper in branch at the end of the week. I
have a proposal due tomorrow, so I cannot do it right now.

  Thanks,

Matt

2- in second method I decided to use DMTSSetIFunction
but still, FormFunction is in form of TSIFunction where we do not define dm 

Re: [petsc-users] Matrix and vector type selection & memory allocation for efficient matrix import?

2018-04-18 Thread k_burk...@yahoo.com
So, practically speaking, I should invent routines to decompose the matrix, e.g. into a block 
matrix structure, to be able to make real use of PETSc, i.e. be able to solve a linear system 
using more than one process/core?

Klaus

 Original message 
Subject: Re: [petsc-users] Matrix and vector type selection & memory allocation for efficient matrix import?
From: "Smith, Barry F."
To: Klaus Burkart
Cc: PETSc Users List

  If you can only generate the nonzero allocation sequentially, you can only solve sequentially, 
which means your matrix is MATSEQAIJ, your vector is VECSEQ, and your communicator is 
PETSC_COMM_SELF.

  If you pass an array for nnz, what you pass for nz is irrelevant; you might as well pass 0.

  Barry

> On Apr 18, 2018, at 10:48 AM, Klaus Burkart wrote:
>
> More questions about matrix and vector type selection for my application:
>
> My starting point is a huge sparse matrix which can be symmetric or asymmetric and a rhs 
> vector. There's no defined local or block structure at all, just row and column indices and 
> the values and an array style rhs vector together describing the entire linear system to be 
> solved. With quite some effort, I should be able to create an array nnz[N] containing the 
> number of nonzeros per row in the global matrix for memory allocation, which would leave me 
> with MatSeqAIJSetPreallocation(M, 0, nnz); as the only option for efficient memory allocation, 
> i.e. a MATSEQAIJ matrix and VECSEQ. I assume here that 0 indicates different numbers of 
> nonzero values in each row, the exact number being stored in the nnz array. Regarding this 
> detail, but one example assumes a constant number of nz per row, so I am not sure whether I 
> should write 0 or NULL for nz?
>
> I started with:
>
> MatCreate(PETSC_COMM_WORLD, &M);
> MatSetSizes(M, PETSC_DECIDE, PETSC_DECIDE, N, N);
> MatSetFromOptions(M);
>
> taken from a paper and assume the latter would set the matrix type to MATSEQAIJ, which might 
> conflict with PETSC_COMM_WORLD. Maybe decompositioning took place at an earlier stage and the 
> authors of the paper were able to retrieve the local data and structure.
>
> What type of matrix and vector should I use for my application, e.g. MATSEQAIJ and VECSEQ, to 
> be able to use MatSeqAIJSetPreallocation(M, 0, nnz); for efficient memory allocation?
>
> In this case, where would the decompositioning / MPI process allocation take place?

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Dave May
On 18 April 2018 at 21:06, Sepideh Kavousi  wrote:

> Mathew
>
> I added the lines and I still have the same issue. It may be a silly
> question but should I configure and install petsc again using this new
> lines added? or changing the line is enough? the highlighted lines are the
> lines I modified.
>
>
> PetscErrorCode ierr;
>   DM             dm;
>   DMTS_DA        *dmdats = (DMTS_DA*)ctx;
>   DMDALocalInfo  info;
>   Vec            Xloc,Xdotloc;
>   void           *x,*f,*xdot;
>
>   PetscFunctionBegin;
>   PetscValidHeaderSpecific(ts,TS_CLASSID,1);
>   PetscValidHeaderSpecific(X,VEC_CLASSID,2);
>   PetscValidHeaderSpecific(F,VEC_CLASSID,3);
>   if (!dmdats->ifunctionlocal) SETERRQ(PetscObjectComm((PetscObject)ts),PETSC_ERR_PLIB,"Corrupt context");
>   ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
>   ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>   ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>
>
Don't forget to include these calls (in this order) after you are done with
the Xdotloc vector

ierr = DMDAVecRestoreArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
ierr = DMRestoreLocalVector(dm,&Xdotloc);CHKERRQ(ierr);

Failure to do so will result in a memory leak.



> Thanks,
>
> Sepideh
> --
> *From:* Matthew Knepley 
> *Sent:* Tuesday, April 17, 2018 5:59:12 PM
>
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
> On Tue, Apr 17, 2018 at 3:07 PM, Sepideh Kavousi  wrote:
>
> The reason  I can not use your method is that the input arguments of
> SetIFunctionLocal are the arrays of x,x_t instead of x,x_t vectors. In your
> method which was:
>
> ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>
>
> You misunderstand my suggestion. I mean stick this code in right here in
> PETSc
>
> https://bitbucket.org/petsc/petsc/annotate/be3efd428a942676a0189b3273b3c582694ff011/src/ts/utils/dmdats.c?at=master&fileviewer=file-view-default#dmdats.c-68
>
> Then the X_t array you get in your local function will be ghosted.
>
>Matt
>
>
> I need to have the vector of Xdot, not the array. So I think I should use
> SetIFunction instead of SetIFunctionLocal.
>
>
> Sepideh
> --
> *From:* Matthew Knepley 
> *Sent:* Tuesday, April 17, 2018 1:22:53 PM
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
> On Tue, Apr 17, 2018 at 1:50 PM, Sepideh Kavousi  wrote:
>
> Mathew,
> I previously use DMDATSSetIFunctionLocal(user.da,INSERT_VALUES,(DMDATSIFunctionLocal) FormFunction,&user) in my code.
> If I want to use your solution I can not use it because in the FormFunction
> definition I must use arrays, not vectors.So to solve this issue I followed
> two methods where none were able to solve it.
> 1- in first method I decided to use TSSetIFunction instead of
> DMDATSSetIFunctionLocal
>
> for this means first in the main function, I use TSSetDM and  my form
> function variables were as:
> PetscErrorCode FormFunction(TS ts,PetscScalar t,Vec Y,Vec Ydot,Vec F,
> struct VAR_STRUCT *user) {
> .
> .
> .
> .
> ierr = TSGetDM(ts,&dmda);CHKERRQ(ierr);
> ierr = DMDAGetLocalInfo(dmda,&info);CHKERRQ(ierr);
>
> ierr = DMGetLocalVector(dmda,&Ydot_local);CHKERRQ(ierr);
> ierr = DMGlobalToLocalBegin(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
> ierr = DMGlobalToLocalEnd(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
> .
> .
> .
>
> }
> But still, it does not consider vectors y,ydot,f related to dmda (problem
> executing DMDAVecGetArray)
>
>
> We cannot help you if you do not show full error messages.
>
> Why not fix the code with SetIFunctionLocal(), as I said in my last email.
> I will fix PETSc proper in branch at the end of the week. I
> have a proposal due tomorrow, so I cannot do it right now.
>
>   Thanks,
>
> Matt
>
>
> 2- in second method I decided to use DMTSSetIFunction
> but still, FormFunction is in form of TSIFunction where we do not define
> dm object and I think it does not understand dm and da are connected,
> although I have used TSSetDM in the main function.
>
> Can you please help me what should I do?
> Regards,
> Sepideh
>
>
>
>
> --
> *From:* Matthew 

Re: [petsc-users] error running parallel on cluster

2018-04-18 Thread Matthew Knepley
On Wed, Apr 18, 2018 at 4:06 PM, Sepideh Kavousi  wrote:

> Mathew
>
> I added the lines and I still have the same issue. It may be a silly
> question but should I configure and install petsc again using this new
> lines added?
>

No, but you need to make

  cd $PETSC_DIR
  make all

  Thanks,

 Matt


> or changing the line is enough? the highlighted lines are the lines I
> modified.
>
>
> PetscErrorCode ierr;
>   DM             dm;
>   DMTS_DA        *dmdats = (DMTS_DA*)ctx;
>   DMDALocalInfo  info;
>   Vec            Xloc,Xdotloc;
>   void           *x,*f,*xdot;
>
>   PetscFunctionBegin;
>   PetscValidHeaderSpecific(ts,TS_CLASSID,1);
>   PetscValidHeaderSpecific(X,VEC_CLASSID,2);
>   PetscValidHeaderSpecific(F,VEC_CLASSID,3);
>   if (!dmdats->ifunctionlocal) SETERRQ(PetscObjectComm((PetscObject)ts),PETSC_ERR_PLIB,"Corrupt context");
>   ierr = TSGetDM(ts,&dm);CHKERRQ(ierr);
>   ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGetLocalVector(dm,&Xloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,X,INSERT_VALUES,Xloc);CHKERRQ(ierr);
>   ierr = DMDAGetLocalInfo(dm,&info);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xloc,&x);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>
> Thanks,
>
> Sepideh
> --
> *From:* Matthew Knepley 
> *Sent:* Tuesday, April 17, 2018 5:59:12 PM
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
> On Tue, Apr 17, 2018 at 3:07 PM, Sepideh Kavousi  wrote:
>
> The reason  I can not use your method is that the input arguments of
> SetIFunctionLocal are the arrays of x,x_t instead of x,x_t vectors. In your
> method which was:
>
> ierr = DMGetLocalVector(dm,&Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalBegin(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMGlobalToLocalEnd(dm,Xdot,INSERT_VALUES,Xdotloc);CHKERRQ(ierr);
>   ierr = DMDAVecGetArray(dm,Xdotloc,&xdot);CHKERRQ(ierr);
>
>
> You misunderstand my suggestion. I mean stick this code in right here in
> PETSc
>
> https://bitbucket.org/petsc/petsc/annotate/be3efd428a942676a0189b3273b3c582694ff011/src/ts/utils/dmdats.c?at=master&fileviewer=file-view-default#dmdats.c-68
>
> Then the X_t array you get in your local function will be ghosted.
>
>Matt
>
>
> I need to have the vector of Xdot, not the array. So I think I should use
> SetIFunction instead of SetIFunctionLocal.
>
>
> Sepideh
> --
> *From:* Matthew Knepley 
> *Sent:* Tuesday, April 17, 2018 1:22:53 PM
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error running parallel on cluster
>
> On Tue, Apr 17, 2018 at 1:50 PM, Sepideh Kavousi  wrote:
>
> Mathew,
> I previously use DMDATSSetIFunctionLocal(user.da,INSERT_VALUES,(DMDATSIFunctionLocal) FormFunction,&user) in my code.
> If I want to use your solution I can not use it because in the FormFunction
> definition I must use arrays, not vectors.So to solve this issue I followed
> two methods where none were able to solve it.
> 1- in first method I decided to use TSSetIFunction instead of
> DMDATSSetIFunctionLocal
>
> for this means first in the main function, I use TSSetDM and  my form
> function variables were as:
> PetscErrorCode FormFunction(TS ts,PetscScalar t,Vec Y,Vec Ydot,Vec F,
> struct VAR_STRUCT *user) {
> .
> .
> .
> .
> ierr = TSGetDM(ts,&dmda);CHKERRQ(ierr);
> ierr = DMDAGetLocalInfo(dmda,&info);CHKERRQ(ierr);
>
> ierr = DMGetLocalVector(dmda,&Ydot_local);CHKERRQ(ierr);
> ierr = DMGlobalToLocalBegin(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
> ierr = DMGlobalToLocalEnd(dmda,Ydot,INSERT_VALUES,Ydot_local);CHKERRQ(ierr);
> .
> .
> .
>
> }
> But still, it does not consider vectors y,ydot,f related to dmda (problem
> executing DMDAVecGetArray)
>
>
> We cannot help you if you do not show full error messages.
>
> Why not fix the code with SetIFunctionLocal(), as I said in my last email.
> I will fix PETSc proper in branch at the end of the week. I
> have a proposal due tomorrow, so I cannot do it right now.
>
>   Thanks,
>
> Matt
>
>
> 2- in second method I decided to use DMTSSetIFunction
> but still, FormFunction is in form of TSIFunction where we do not define
> dm object and I think it does not understand dm and da are connected,
> although I have used TSSetDM in the main function.
>
> Can you please help me what should I do?
> Regards,
> Sepideh
>
>
>
>
> --
> *From:* Matthew Knepley 
> *Sent:* Monday, April 16, 2018 9:34:01 PM
>
> *To:* Sepideh Kavousi
> *Cc:* petsc-users@mcs.anl.gov
> *Subject:* Re: [petsc-users] error 

Re: [petsc-users] Matrix and vector type selection & memory allocation for efficient matrix import?

2018-04-18 Thread Smith, Barry F.

  If you can only generate the nonzero allocation sequentially, you can only solve 
sequentially, which means your matrix is MATSEQAIJ, your vector is VECSEQ, and your 
communicator is PETSC_COMM_SELF.

   If you pass an array for nnz, what you pass for nz is irrelevant; you might as well pass 0.
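(A minimal sketch of that sequential setup, assuming N and the nnz array have already been
computed; the variable names are only illustrative:)

  Mat            M;
  PetscErrorCode ierr;
  ierr = MatCreate(PETSC_COMM_SELF,&M);CHKERRQ(ierr);               /* sequential: PETSC_COMM_SELF */
  ierr = MatSetSizes(M,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
  ierr = MatSetType(M,MATSEQAIJ);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(M,0,nnz);CHKERRQ(ierr);          /* nnz[i] = nonzeros in row i */
  /* ... fill with MatSetValues(), then MatAssemblyBegin()/MatAssemblyEnd() ... */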

   Barry


> On Apr 18, 2018, at 10:48 AM, Klaus Burkart  wrote:
> 
> More questions about matrix and vector type selection for my application:
> 
> My starting point is a huge sparse matrix which can be symmetric or 
> asymmetric and a rhs vector. There's no defined local or block structure at 
> all, just row and column indices and the values and an array style rhs vector 
> together describing the entire linear system to be solved. With quite some 
> effort, I should be able to create an array nnz[N] containing the number of 
> nonzeros per row in the global matrix for memory allocation which would leave 
> me with MatSeqAIJSetPreallocation(M, 0, nnz); as the only option for 
> efficient memory allocation ie. a MATSEQAIJ matrix and VECSEQ. I assume here, 
> that 0 indicates different numbers of nonzero values in each row, the exact 
> number being stored in the nnz array. Regarding this detail but one example 
> assume a constant number of nz per row so I am not sure whether I should 
> write 0 or NULL for nz?
> 
> I started with:
> 
> MatCreate(PETSC_COMM_WORLD, &M);
> MatSetSizes(M, PETSC_DECIDE, PETSC_DECIDE, N, N);
> MatSetFromOptions(M);
> 
> taken from a paper and assume, the latter would set the matrix type to 
> MATSEQAIJ which might conflict with PETSC_COMM_WORLD. Maybe decompositioning 
> took place at an earlier stage and the authors of the paper were able to 
> retrieve the local data and structure. 
> 
> What type of matrix and vector should I use for my application e.g. MATSEQAIJ 
> and VECSEQ to be able to use MatSeqAIJSetPreallocation(M, 0, nnz); for 
> efficient memory allocation?
> 
> In this case, where would the  decompositioning / MPI process allocation take 
> place?



Re: [petsc-users] transform in a block matrix

2018-04-18 Thread Matthew Knepley
On Wed, Apr 18, 2018 at 8:01 AM, Sonia Pozzi  wrote:

> Dear Petsc users and developers,
>
> I have an MPIAIJ matrix (already assembled) and I would like to transform
> it into a block matrix with block dimension 3.
> Is there an easy way to do that?
>

I believe that


http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatConvert.html

will work.
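(A rough sketch of that conversion, assuming the target is MATMPIBAIJ with block size 3, that
the matrix layout is divisible by 3, and that the block size has been attached to the original
matrix, e.g. via MatSetBlockSize(A,3) before preallocation/assembly:)

  Mat B;
  /* A is the assembled MPIAIJ matrix; the BAIJ conversion is assumed to pick up A's block size */
  ierr = MatConvert(A,MATMPIBAIJ,MAT_INITIAL_MATRIX,&B);CHKERRQ(ierr);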

  Thanks,

 Matt


> Thank you in advance,
>
> Sonia
>
>
> Ph.D. Student
> Group of Prof. Rolf Krause
> Institute of Computational Science
> Università della Svizzera italiana
> Center for Computation Medicine in Cardiology
> Via Giuseppe Buffi, 13
> 
> CH-6900 Lugano
> 
> Switzerland
> 
>
> Email: poz...@usi.ch
> Phone: +41 58 666 4972
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/