[petsc-users] creating a global vector of one particular field from a global vector created from multiple dof dmda

2013-08-08 Thread Bishesh Khanal
Hi all, Let's say I have two DMDAs with identical sizes but different dofs, e.g. da1 with dof=4 and da2 with dof=1. I have global vectors associated with each of them, say gv1 and gv2 respectively. How can I copy/scatter the values of one particular field from gv1 to gv2? Looking at the

Re: [petsc-users] creating a global vector of one particular field from a global vector created from multiple dof dmda

2013-08-08 Thread Matthew Knepley
On Thu, Aug 8, 2013 at 6:32 AM, Bishesh Khanal bishes...@gmail.com wrote: Hi all, Let's say I have two DMDAs with identical size but with different dofs. E.g. da1 with dof=4; da2 with dof=1; I have global vectors associated with each one of them, say, gv1 and gv2 respectively. How can I
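For reference, one common way to do this is with VecStrideGather(), which pulls a single interlaced component out of a multi-dof vector. A minimal sketch follows, using the current PETSc API and assuming both DMDAs are created identically except for the dof so their parallel layouts match; the grid sizes and field index are illustrative and error checking is omitted.

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM       da1, da2;
  Vec      gv1, gv2;
  PetscInt field = 0;   /* which of the 4 interlaced components to extract (illustrative) */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Two DMDAs with identical global sizes, differing only in dof (4 vs 1) */
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_BOX,
               64, 64, PETSC_DECIDE, PETSC_DECIDE, 4, 1, NULL, NULL, &da1);
  DMSetFromOptions(da1);
  DMSetUp(da1);
  DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_BOX,
               64, 64, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, &da2);
  DMSetFromOptions(da2);
  DMSetUp(da2);

  DMCreateGlobalVector(da1, &gv1);
  DMCreateGlobalVector(da2, &gv2);

  /* Copy component `field` of the interlaced 4-dof vector into the 1-dof vector.
     Because the two DMDAs have the same sizes and ownership ranges, the strided
     gather lines up grid point for grid point. */
  VecStrideGather(gv1, field, gv2, INSERT_VALUES);

  /* The reverse direction would be VecStrideScatter(gv2, field, gv1, INSERT_VALUES); */

  VecDestroy(&gv1);
  VecDestroy(&gv2);
  DMDestroy(&da1);
  DMDestroy(&da2);
  PetscFinalize();
  return 0;
}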

[petsc-users] implementation of multi-level grid in petsc

2013-08-08 Thread Roc Wang
Hi, I am working on a multi-level grid for the Poisson equation. I need to refine some sub-regions of the computational domain. To do this, I plan to build some boxes (patches) based on the coarsest level. I am using DM to manage the data. I found there is a new function DMPatchCreate() in the

Re: [petsc-users] implementation of multi-level grid in petsc

2013-08-08 Thread Matthew Knepley
On Thu, Aug 8, 2013 at 1:29 PM, Roc Wang pengxw...@hotmail.com wrote: Hi, I am working on a multi-level grid for the Poisson equation. I need to refine some sub-regions of the computational domain. To do this, I plan to build some boxes (patches) based on the coarsest level. I am using DM to

Re: [petsc-users] implementation of multi-level grid in petsc

2013-08-08 Thread Roc Wang
Thanks Matt, I tried Chombo for implementing AMR but have not tried SAMRAI yet. Chombo can do AMR, but its data structures seem quite complicated to customize. What I want to do with PETSc is to compose a simple home-made block-structured multi-level grid, though it is not

[petsc-users] VecGhost memory layout

2013-08-08 Thread Mohammad Mirzadeh
Hi guys, I'm running into a bug that has made me question my understanding of the memory layout in VecGhost. First, I remember reading somewhere (in the manual or mailing list, which I cannot find now) that they are internally organized as all the local values followed by all the ghost values.

Re: [petsc-users] implementation of multi-level grid in petsc

2013-08-08 Thread Mark F. Adams
On Aug 8, 2013, at 3:32 PM, Roc Wang pengxw...@hotmail.com wrote: Thanks Matt, I tried Chombo for implementing AMR but have not tried SAMRAI yet. Chombo can do AMR, but its data structures seem quite complicated to customize. What I want to do with PETSc is to compose a

Re: [petsc-users] VecGhost memory layout

2013-08-08 Thread Barry Smith
On Aug 8, 2013, at 2:56 PM, Mohammad Mirzadeh mirza...@gmail.com wrote: Hi guys, I'm running into a bug that has made me question my understanding of memory layout in VecGhost. First, I remember reading somewhere before (in the manual or mailing list which I cannot find now) that the
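For reference, that layout can be checked directly: the local form of a ghosted vector is the locally owned values followed by the ghost values. A minimal standalone sketch follows, with made-up sizes and ghost indices, meant to be run on a few MPI processes; error checking is omitted.

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec               gx, lx;
  PetscInt          nlocal = 4, nghost = 2, ghosts[2], i, n;
  PetscMPIInt       rank, size;
  const PetscScalar *a;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* Each process owns 4 entries and ghosts the first 2 entries of the next process */
  ghosts[0] = ((rank + 1) % size) * nlocal;
  ghosts[1] = ghosts[0] + 1;
  VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, nghost, ghosts, &gx);

  VecSet(gx, (PetscScalar)rank);                           /* owned values = my rank */
  VecGhostUpdateBegin(gx, INSERT_VALUES, SCATTER_FORWARD);
  VecGhostUpdateEnd(gx, INSERT_VALUES, SCATTER_FORWARD);   /* fill the ghost slots   */

  /* The local form has length nlocal + nghost: indices [0, nlocal) are the owned
     values, indices [nlocal, nlocal + nghost) are the ghost values. */
  VecGhostGetLocalForm(gx, &lx);
  VecGetLocalSize(lx, &n);
  VecGetArrayRead(lx, &a);
  for (i = 0; i < n; i++)
    PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] entry %d = %g\n", rank, (int)i, (double)PetscRealPart(a[i]));
  PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);
  VecRestoreArrayRead(lx, &a);
  VecGhostRestoreLocalForm(gx, &lx);

  VecDestroy(&gx);
  PetscFinalize();
  return 0;
}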

Re: [petsc-users] implementation of multi-level grid in petsc

2013-08-08 Thread Mohammad Mirzadeh
How big an application are you looking at? If you are thinking in the range of a couple of tens of millions of grid points on a couple of hundred processors, then I'd say the simplest approach is to create the grid in serial and then use PETSc's interface to ParMETIS to handle the partitioning. I did this with my
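For reference, the PETSc interface to ParMETIS mentioned here is the MatPartitioning object. Below is a minimal sketch on a toy distributed graph: the ring stands in for a real mesh connectivity graph, MATPARTITIONINGPARMETIS requires a PETSc build configured with ParMETIS, and the example assumes at least two MPI ranks.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat             adj;
  MatPartitioning part;
  IS              is;
  PetscInt        *ia, *ja, v, N;
  PetscMPIInt     rank, size;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* Toy graph: a ring of 2*size vertices, 2 owned per process, each connected to
     its two neighbours; in practice ia/ja would describe the mesh's cell graph. */
  N = 2 * size;
  PetscMalloc1(3, &ia);
  PetscMalloc1(4, &ja);
  ia[0] = 0; ia[1] = 2; ia[2] = 4;
  for (v = 0; v < 2; v++) {
    PetscInt g = 2 * rank + v;           /* global number of my v-th vertex */
    ja[2 * v]     = (g + N - 1) % N;     /* previous neighbour in the ring  */
    ja[2 * v + 1] = (g + 1) % N;         /* next neighbour in the ring      */
  }
  /* MatCreateMPIAdj takes ownership of ia/ja (they must be PetscMalloc'd and are
     freed when the matrix is destroyed), so they are not freed here. */
  MatCreateMPIAdj(PETSC_COMM_WORLD, 2, N, ia, ja, NULL, &adj);

  MatPartitioningCreate(PETSC_COMM_WORLD, &part);
  MatPartitioningSetAdjacency(part, adj);
  MatPartitioningSetType(part, MATPARTITIONINGPARMETIS);
  MatPartitioningSetFromOptions(part);
  MatPartitioningApply(part, &is);       /* is[i] = rank that local vertex i is assigned to */
  ISView(is, PETSC_VIEWER_STDOUT_WORLD);

  ISDestroy(&is);
  MatPartitioningDestroy(&part);
  MatDestroy(&adj);
  PetscFinalize();
  return 0;
}

MatPartitioningApply() only returns, for each locally owned vertex, the rank it should move to; migrating the actual mesh data is up to the application.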

Re: [petsc-users] VecGhost memory layout

2013-08-08 Thread Mohammad Mirzadeh
Awesome. Thanks Barry for the quick response. On Thu, Aug 8, 2013 at 1:42 PM, Barry Smith bsm...@mcs.anl.gov wrote: On Aug 8, 2013, at 2:56 PM, Mohammad Mirzadeh mirza...@gmail.com wrote: Hi guys, I'm running into a bug that has made me question my understanding of memory layout in

Re: [petsc-users] KSP solver for single process

2013-08-08 Thread Alan
Dear Dr. Smith, I sincerely appreciate your valuable answers. My KSP Poisson solver has been significantly sped up with your help. Now I wonder what extra steps I should take to employ geometric MG for a non-uniform Cartesian mesh. I suppose the DMDA won't automatically generate the coarse grid

Re: [petsc-users] KSP solver for single process

2013-08-08 Thread Barry Smith
On Aug 8, 2013, at 6:16 PM, Alan zhenglun@gmail.com wrote: Dear Dr. Smith, I sincerely appreciate your valuable answers. My KSP Poisson solver has been significantly sped up with your help. Now I wonder what extra steps I should take to employ geometric MG for a non-uniform Cartesian mesh.
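For reference, the usual way to have the coarse grids generated automatically is to hand the DMDA to the KSP, supply discretization callbacks, and select multigrid from the options database. Below is a minimal sketch for a uniform 1D Poisson problem; the callback names are illustrative, error checking is omitted, and on a non-uniform Cartesian mesh the callbacks would read the spacing from the DM's coordinate vector instead of assuming a constant h.

#include <petscdmda.h>
#include <petscksp.h>

/* RHS for -u'' = 1 with homogeneous Dirichlet BCs, scaled by h^2 to match the matrix */
static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  DM           da;
  PetscInt     i, xs, xm, M;
  PetscReal    h;
  PetscScalar *barr;

  KSPGetDM(ksp, &da);
  DMDAGetInfo(da, NULL, &M, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);
  h = 1.0 / (PetscReal)(M - 1);
  DMDAGetCorners(da, &xs, NULL, NULL, &xm, NULL, NULL);
  DMDAVecGetArray(da, b, &barr);
  for (i = xs; i < xs + xm; i++) barr[i] = (i == 0 || i == M - 1) ? 0.0 : h * h;
  DMDAVecRestoreArray(da, b, &barr);
  return 0;
}

/* Standard 3-point Laplacian; rediscretized by PCMG on every level it creates */
static PetscErrorCode ComputeMatrix(KSP ksp, Mat J, Mat A, void *ctx)
{
  DM          da;
  PetscInt    i, xs, xm, M;
  MatStencil  row, col[3];
  PetscScalar v[3];

  KSPGetDM(ksp, &da);
  DMDAGetInfo(da, NULL, &M, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL);
  DMDAGetCorners(da, &xs, NULL, NULL, &xm, NULL, NULL);
  for (i = xs; i < xs + xm; i++) {
    row.i = i;
    if (i == 0 || i == M - 1) {            /* Dirichlet boundary rows */
      v[0] = 1.0;
      MatSetValuesStencil(A, 1, &row, 1, &row, v, INSERT_VALUES);
    } else {
      col[0].i = i - 1; col[1].i = i; col[2].i = i + 1;
      v[0] = -1.0; v[1] = 2.0; v[2] = -1.0;
      MatSetValuesStencil(A, 1, &row, 3, col, v, INSERT_VALUES);
    }
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  if (J != A) {
    MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);
  }
  return 0;
}

int main(int argc, char **argv)
{
  DM  da;
  KSP ksp;

  PetscInitialize(&argc, &argv, NULL, NULL);
  DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 17, 1, 1, NULL, &da);
  DMSetFromOptions(da);    /* picks up -da_refine <n> to build finer grids */
  DMSetUp(da);
  /* For a non-uniform mesh, attach coordinates here, e.g. with
     DMDASetUniformCoordinates() or by filling the coordinate vector directly. */

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetDM(ksp, da);
  KSPSetComputeRHS(ksp, ComputeRHS, NULL);
  KSPSetComputeOperators(ksp, ComputeMatrix, NULL);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, NULL, NULL);

  KSPDestroy(&ksp);
  DMDestroy(&da);
  PetscFinalize();
  return 0;
}

Run, for example, with -da_refine 5 -pc_type mg -ksp_monitor; the coarser DMDAs for the multigrid hierarchy are then created by PETSc rather than by hand.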