On Feb 8, 2011, at 10:52 AM, Klaus Zimmermann wrote:

> Hi Barry,
> 
> thanks for your response. I guess my code excerpt wasn't very good, nor was 
> my description. My apologies.
> 
> The evaluation in more details goes like this:
> Depending on the coordinates x and y (and only on them) I calculate 4 vectors 
> S1,...,S4.
> I have a constant matrix C in the appctx. The three quantities I am 
> interested in are then:
> 
> 1) u1 = (x+y)*VecTDot(S1, MatMult(C, S2))
> 2) u2 = u1/(x+y) + (x+y)*VecTDot(S3, MatMult(C, S2))
> 3) u3 = u1/(x+y) + (x+y)*VecTDot(S1, MatMult(C, S4))
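> 
> In terms of actual PETSc calls I picture this roughly as follows (t is a 
> hypothetical work vector; note that u1/(x+y) is just the first dot product, 
> so it can be reused):
> 
>   PetscScalar a12, a32, a14, u1, u2, u3;  /* a_ij = S_i^T C S_j */
>   PetscReal   s = x + y;
> 
>   ierr = MatMult(C, S2, t);CHKERRQ(ierr);    /* t = C*S2, used twice */
>   ierr = VecTDot(S1, t, &a12);CHKERRQ(ierr); /* a12 = S1^T C S2 */
>   ierr = VecTDot(S3, t, &a32);CHKERRQ(ierr); /* a32 = S3^T C S2 */
>   ierr = MatMult(C, S4, t);CHKERRQ(ierr);    /* t = C*S4 */
>   ierr = VecTDot(S1, t, &a14);CHKERRQ(ierr); /* a14 = S1^T C S4 */
> 
>   u1 = s*a12;
>   u2 = a12 + s*a32;  /* = u1/(x+y) + (x+y)*S3^T C S2 */
>   u3 = a12 + s*a14;  /* = u1/(x+y) + (x+y)*S1^T C S4 */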
> 
   How big are the S vectors?

> With this (hopefully better) description let me answer your points below 
> individually.
> On 02/08/2011 05:29 PM, Barry Smith wrote:
>> On Feb 8, 2011, at 10:16 AM, Klaus Zimmermann wrote:
>> 
>>> Dear all,
>>> 
>>> I want to evaluate a function on a grid. Right now I have some custom code 
>>> (see the end of this message).
>>> I am now thinking of rewriting this using more PETSc facilities because in
>>> the future we want to extend the
>>> program to higher dimensions and also unstructured grids.
>>    There is little similarity and little chance of code reuse between using 
>> a structured grid and an unstructured grid at the level of your function 
>> evaluations. So if you truly want to have a structured grid version and 
>> unstructured you should just have different FormFunctions for them.
> This is why I hoped to reuse code for the unstructured version: as long as I 
> can call a method with coordinates and a context, I am fine.
> I don't really need any solving. Do I still need different FormFunctions?

  NO

> 
>>> As I understand it now there are several
>>> candidates in PETSc for doing this:
>>> 
>>> 1) PF
>>> 2) DALocalFunction
>>    Using the "local" function approach with DMMGSetSNESLocal() is just a way 
>> of hiding the DAVecGetArray() calls from the function code and is good when 
>> you have simple FormFunctions that do not rely on other vectors besides the 
>> usual input and output vectors. In your example below I do not understand 
>> why you have the input and output vectors inside the appctx; usually they 
>> are the input and output arguments to the FormFunction set with 
>> SNESSetFunction() or DMMGSetSNES().
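>> 
>>    For the structured-grid case such a local function has roughly this 
>> shape (a sketch only; FormFunctionLocal and its body are placeholders):
>> 
>>   PetscErrorCode FormFunctionLocal(DALocalInfo *info, PetscScalar **x,
>>                                    PetscScalar **f, AppCtx *user)
>>   {
>>     PetscInt i, j;
>>     /* the DALocalInfo replaces the DAGetCorners()/DAVecGetArray() calls */
>>     for (j = info->ys; j < info->ys + info->ym; j++) {
>>       for (i = info->xs; i < info->xs + info->xm; i++) {
>>         f[j][i] = x[j][i];  /* placeholder: residual at node (i,j) */
>>       }
>>     }
>>     return 0;
>>   }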
> I agree. This is mostly because I didn't understand the concepts very well 
> at the time I wrote this code, and it is one of the reasons I would like to 
> refactor.
> In my case there should in principle be three output vectors. All the 
> facilities I have seen in PETSc only deal with a single output vector. Is 
> this correct?
> Of course there is an obvious mapping, but I would prefer to keep the vectors 
> apart because that way it is easier to deal with the parallel layout.

  You can keep them separate. You can have as many vector inputs and outputs 
as you want (it is only the SNES solvers that need exactly one input and one 
output). Sometimes storing several vectors "interlaced" gives better 
performance since it uses the caches better, but that is only an 
optimization.
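
  For example, an interlaced layout is just a DA with three degrees of 
freedom per node (a rough sketch; Field, Mx and My are hypothetical names):

  typedef struct { PetscScalar psi, gpx, gpy; } Field;

  DA      da;
  Vec     U;
  Field **u;

  ierr = DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_STAR,
                    Mx, My, PETSC_DECIDE, PETSC_DECIDE, 3, 1,
                    PETSC_NULL, PETSC_NULL, &da);CHKERRQ(ierr);
  ierr = DACreateGlobalVector(da, &U);CHKERRQ(ierr);
  ierr = DAVecGetArray(da, U, &u);CHKERRQ(ierr);
  /* u[j][i].psi, u[j][i].gpx and u[j][i].gpy are adjacent in memory
     for each grid point, which is what helps the cache */
  ierr = DAVecRestoreArray(da, U, &u);CHKERRQ(ierr);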

  If you separate the "iterator" part of the code from the "function" part 
then you can have a different iterator for structured and unstructured grids 
but reuse the "function" part.
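
  Concretely, the "function" part could take only the coordinates and the 
context (a hypothetical name and signature):

  /* depends only on (x,y) and the context, so a structured iterator
     looping over the DA corners and an unstructured iterator looping
     over its vertex list can both call the same routine */
  PetscErrorCode PointFunction(AppCtx *user, PetscReal x, PetscReal y,
                               PetscScalar *psi, PetscScalar *gpx,
                               PetscScalar *gpy);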
> 
>>> 3) FIAT (?)
>>> 
>>> Could you please advise on what should be used?
>>> An additional problem is that several distinct functions should be 
>>> evaluated at the same time due to the reuse of intermediate results.
>> 
>>    Are these functions associated with different SNES solvers, or is it the 
>> composition of these "several distinct functions" that defines the equations 
>> you are solving with PETSc? If the latter, then you just need to write either 
>> a single function that computes everything or make some smart use of inlining 
>> to get good performance even with different functions.
> I am not really doing any solving at the moment. Please let me know if you 
> need more detail.
> 
> Thanks again,
> Klaus
> 
>>    Barry
>> 
>>> Any help is appreciated!
>>> Thanks in advance,
>>> Klaus
>>> 
>>> ------------------8<------------------8<------------------8<------------------8<------------------8<--------------
>>> 
>>> PetscErrorCode EvaluatePsiAndGradPsi(AppCtx *user) {
>>>   PetscErrorCode ierr;
>>>   PetscInt       i, j, cxs, cxm, cys, cym;
>>>   PetscScalar    **lpsi, **lGradPsi_x, **lGradPsi_y;
>>>   Vec            gc;
>>>   DACoor2d       **coors;
>>>   DA             cda;
>>> 
>>>   /* the coordinate DA and coordinate vector are borrowed references
>>>      owned by the DA, so they must not be destroyed below */
>>>   ierr = DAGetCoordinateDA(user->zGrid, &cda);CHKERRQ(ierr);
>>>   ierr = DAGetCoordinates(user->zGrid, &gc);CHKERRQ(ierr);
>>>   ierr = DAVecGetArray(cda, gc, &coors);CHKERRQ(ierr);
>>>   ierr = DAGetCorners(cda, &cxs, &cys, PETSC_NULL, &cxm, &cym,
>>>                       PETSC_NULL);CHKERRQ(ierr);
>>> 
>>>   ierr = DAVecGetArray(user->zGrid, user->psi, &lpsi);CHKERRQ(ierr);
>>>   ierr = DAVecGetArray(user->zGrid, user->gradPsi_x, &lGradPsi_x);CHKERRQ(ierr);
>>>   ierr = DAVecGetArray(user->zGrid, user->gradPsi_y, &lGradPsi_y);CHKERRQ(ierr);
>>> 
>>>   /* PETSc 2d arrays are indexed [y][x]: i runs over the local y
>>>      range, j over the local x range */
>>>   for (i = cys; i < cys+cym; i++) {
>>>     for (j = cxs; j < cxs+cxm; j++) {
>>>       PetscReal x = PetscRealPart(coors[i][j].x - coors[i][j].y),
>>>                 y = PetscRealPart(coors[i][j].y);
>>>       if (x >= 0) {
>>>         /* per-point routine; C has no overloading, so it needs a
>>>            name distinct from this driver */
>>>         ierr = EvaluatePsiAndGradPsiAtPoint(user, x, y,
>>>                                             &lpsi[i][j],
>>>                                             &lGradPsi_x[i][j],
>>>                                             &lGradPsi_y[i][j]);CHKERRQ(ierr);
>>>       }
>>>     }
>>>   }
>>> 
>>>   ierr = DAVecRestoreArray(user->zGrid, user->gradPsi_x, &lGradPsi_x);CHKERRQ(ierr);
>>>   ierr = DAVecRestoreArray(user->zGrid, user->gradPsi_y, &lGradPsi_y);CHKERRQ(ierr);
>>>   ierr = DAVecRestoreArray(user->zGrid, user->psi, &lpsi);CHKERRQ(ierr);
>>>   ierr = DAVecRestoreArray(cda, gc, &coors);CHKERRQ(ierr);
>>>   return 0;
>>> }
> 
