Wienand Drenth wrote:
> Hello Barry,
>
> Thank you for that.
>
> Just another question. As I wrote in my first email, in the current
> code we use local non-PETSc arrays, and with VecPlaceArray we "give"
> these arrays to PETSc vectors to do the KSPSolve. Afterwards, we can
> just continue with our local non-PETSc arrays. If I understand you
> correctly, and for my own knowledge, this approach will not be
> possible in a parallel setting?
>
> When I run with, for example, two processors, and with the local array being
> blocal = 1, 2, ..., 10
> then on the zeroth processor I also have values 1, 2, ..., 10 and
> not just half (i.e., 1,2,3,4,5,0,0,0,0,0).
> On the first processor I have only part of the values, but they start
> with the first entry of my array, and not half-way:
> 0,0,0,0,0, 1,2,3,4,5 instead of 0,0,0,0,0, 6,7,8,9,10
If it is this simple, you could still use VecPlaceArray, but you would be responsible for updating ghost values in your arrays (KSPSolve will only put the solution in the contiguous owned segment). In 2D or 3D, the owned segment that you want to "give" to the KSP is likely not to be contiguous. But you could just make a copy; it will not cost a significant amount of memory or time. Look at VecScatterCreateToAll(), which can be used to update the copy that the rest of your code works with.

Jed
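As a sketch of the VecScatterCreateToAll() approach: after KSPSolve, scatter the distributed solution into a full-length sequential Vec on every rank, then copy that into the application's own array. The names `x` (the parallel solution Vec) and `blocal` (the user's non-PETSc array) are placeholders for the poster's variables; the calls follow the current PETSc C API, so older releases may differ slightly.

```c
#include <petscksp.h>

/* Sketch: update a full local copy (blocal) of the parallel solution x
   on every process, as suggested above. Error checking abbreviated. */
PetscErrorCode UpdateLocalCopy(Vec x, PetscScalar *blocal)
{
  VecScatter         ctx;
  Vec                xall;   /* sequential Vec of full global length, one per rank */
  const PetscScalar *a;
  PetscInt           i, n;

  PetscCall(VecScatterCreateToAll(x, &ctx, &xall));
  PetscCall(VecScatterBegin(ctx, x, xall, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(ctx, x, xall, INSERT_VALUES, SCATTER_FORWARD));

  /* copy the gathered values into the application's non-PETSc array */
  PetscCall(VecGetSize(xall, &n));
  PetscCall(VecGetArrayRead(xall, &a));
  for (i = 0; i < n; i++) blocal[i] = a[i];
  PetscCall(VecRestoreArrayRead(xall, &a));

  PetscCall(VecScatterDestroy(&ctx));
  PetscCall(VecDestroy(&xall));
  return PETSC_SUCCESS;
}
```

If the scatter is needed after every solve, create the VecScatter and the sequential Vec once and reuse them, destroying them only at the end of the program.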
