On Fri, Oct 4, 2013 at 12:57 PM, Åsmund Ervik <[email protected]> wrote:
> Barry,
>
> Thanks for the quick answer.
>
> Good to hear that I can use the DMDA framework for all variables. Should
> I put all scalars (e.g. pressure, level set function, etc) in the same DA,
> or should I keep a distinct one for the pressure (where I want to use
> multigrid)?
>

Separate variables which are solved for.

> The reason I was unsure is that I can't seem to find an example which
> manipulates the local array from a DA. I would've guessed there was
> something like
>
> real, dimension(:,:,:) u,v,w
> call DMDAGetLocalArray(da,u,v,w)
> ! Some computations looping over local i,j,k that manipulate u,v,w
> call DMDARestoreLocalArray(da,u,v,w)
>

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMDAVecGetArray.html
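
In Fortran the corresponding calls are DMDAVecGetArrayF90() and
DMDAVecRestoreArrayF90(), used on a ghosted local vector. A minimal sketch
of the access pattern, assuming a DMDA "da" created with dof=3, an already
assembled global vector "Xglobal" (both placeholder names), and the usual
PETSc Fortran includes (e.g. finclude/petscdmda.h90 in the 3.4-era layout):

  Vec            Xlocal
  PetscScalar, pointer :: vel(:,:,:,:)   ! component first: vel(0:2,i,j,k) = (u,v,w)
  PetscInt       xs, ys, zs, xm, ym, zm, i, j, k
  PetscErrorCode ierr

  ! Fill a local (ghosted) vector from the global one
  call DMGetLocalVector(da, Xlocal, ierr)
  call DMGlobalToLocalBegin(da, Xglobal, INSERT_VALUES, Xlocal, ierr)
  call DMGlobalToLocalEnd(da, Xglobal, INSERT_VALUES, Xlocal, ierr)

  ! Get an F90 pointer into the local vector and loop over the owned points
  call DMDAVecGetArrayF90(da, Xlocal, vel, ierr)
  call DMDAGetCorners(da, xs, ys, zs, xm, ym, zm, ierr)
  do k = zs, zs+zm-1
     do j = ys, ys+ym-1
        do i = xs, xs+xm-1
           ! vel(0,i,j,k), vel(1,i,j,k), vel(2,i,j,k) hold u, v, w here;
           ! ghost values are readable at indices just outside this range
        end do
     end do
  end do
  call DMDAVecRestoreArrayF90(da, Xlocal, vel, ierr)
  call DMRestoreLocalVector(da, Xlocal, ierr)

Note that the indexing is global (i runs from xs to xs+xm-1 on each
process), so the loop bounds come from DMDAGetCorners rather than 1..nx.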
> On the BCs for velocity, I would like to support several options. To get
> the code up and running I would be OK with just periodic, but I would
> eventually like to support full slip and no slip, and preferably a mix of
> these for the different faces. Perhaps also inflow and outflow. I don't
> need (physical) pressure BCs though. Would this complicate things much?
>

Both periodic and ghost cells are supported. Imposing Dirichlet conditions
on an unknown is also easy.

   Thanks,

      Matt

> I understand the point about the velocity i,j,k lining up; this is how we
> do it currently.
>
> Åsmund
>
> Sent from my VT-102
>
> Barry Smith <[email protected]> wrote:
>
> Asmund,
>
>    You can use the DMDA to manage the layout of your velocity variables
> as well as the pressure variables. You will have two DMDAs, one that manages
> the cell-centered pressure variables (this is created with the dof argument
> of 1) and one that handles the velocities (that is created with the dof
> argument of 3) on the "faces". Then you can have a ghosted representation
> of the velocities from which you compute the right-hand side for your
> pressure equation.
>
>    What kind of boundary conditions do you have for the velocities? This
> will determine exactly how to create the DMDA for the velocities.
>
>    Note that though the x, y, and z velocities are physically associated
> with the three sets of faces of the cells, and thus not collocated on the
> physical domain, you can stack the three of them up at the same i,j,k mesh
> point of the DMDA vector. Depending on your boundary conditions there may
> be fewer pressure variables than velocity variables in each direction of the
> grid; to make the two different DMDAs "line up" you can just have an extra
> "slab" of pressure variables in each direction that are never computed on.
> It's easy to draw a picture in 2D of the staggered grid to see what I mean.
>
>    Barry
>
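To make Barry's two-DMDA suggestion concrete, here is a rough sketch of
creating the pair using the petsc-3.4-era Fortran interface (the
DMDA_BOUNDARY_* constants were later renamed DM_BOUNDARY_*). The global
sizes nx, ny, nz, the all-periodic boundaries, and the stencil width of 1
are assumptions for illustration only:

  DM             da_vel, da_p
  PetscInt       nx, ny, nz          ! global grid sizes, assumed set already
  PetscErrorCode ierr

  ! Velocity DMDA: u, v, w stacked at each i,j,k (dof = 3), one ghost layer
  call DMDACreate3d(PETSC_COMM_WORLD,                                          &
       DMDA_BOUNDARY_PERIODIC, DMDA_BOUNDARY_PERIODIC, DMDA_BOUNDARY_PERIODIC, &
       DMDA_STENCIL_BOX, nx, ny, nz,                                           &
       PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 3, 1,                         &
       PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,             &
       da_vel, ierr)

  ! Pressure DMDA: same global grid, dof = 1, star stencil for the Laplacian
  call DMDACreate3d(PETSC_COMM_WORLD,                                          &
       DMDA_BOUNDARY_PERIODIC, DMDA_BOUNDARY_PERIODIC, DMDA_BOUNDARY_PERIODIC, &
       DMDA_STENCIL_STAR, nx, ny, nz,                                          &
       PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,                         &
       PETSC_NULL_INTEGER, PETSC_NULL_INTEGER, PETSC_NULL_INTEGER,             &
       da_p, ierr)

With the same communicator and the same global sizes the default parallel
decompositions should line up; if you want to force it, the ownership
ranges of one DMDA (DMDAGetOwnershipRanges) can be passed as the lx/ly/lz
arguments when creating the other.
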
> On Oct 4, 2013, at 8:35 AM, Åsmund Ervik <[email protected]> wrote:
>
> > Dear all,
> >
> > We have a two-phase incompressible Navier-Stokes solver written in
> > Fortran where we use PETSc for solving the pressure Poisson equation.
> > Since both PETSc and parallelism were an afterthought to this code, it
> > doesn't scale well at all, so I am tasked with re-writing the whole
> > thing now. Before I commit any fresh mistakes in the design of this new
> > code, I will ask for input on my "design decisions" so far.
> >
> > I want to do domain decomposition on a structured 3D grid. I've been
> > trying to wrap my head around the DM and DMDA parts of PETSc, and as far
> > as I understand, these will help me solve the pressure Poisson equation
> > on a decomposed domain (and with geometric multigrid via Galerkin)
> > fairly easily.
> >
> > The tricky part, then: it seems that I must handle "the rest" of the
> > domain decomposition myself. Omitting some detail, this means my code
> > will:
> >
> > * set up parameters, initial conditions, etc.
> > * decompose my array for the velocity field into several parts
> > * time loop:
> >     * communicate e.g. the velocity field on the boundaries
> >     * each MPI worker will calculate on the local domain the
> >       intermediate velocity field, the rhs to the Poisson equation,
> >       and set up the correct sparse matrix
> >     * PETSc will solve the Poisson equation to give me the pressure
> >     * each MPI worker will then calculate the updated
> >       divergence-free velocity field
> >     * each MPI worker will calculate the time step (CFL condition),
> >       and we choose the lowest dt among all nodes
> > * end time loop
> >
> > Have I misunderstood anything here? At first I thought the DMDA would
> > give me the framework for decomposing the velocity field, handling
> > communication of the ghost values at the boundaries etc., but it seems
> > this is not the case?
> >
> > One further question: is it a good idea to set up the DMDA letting PETSc
> > decide the number of processors in each direction, and then using this
> > same partition for the rest of my code?
> >
> > If there are any unclear details, please ask. If it matters, I am using
> > the level-set and ghost-fluid methods, so the matrix for my Poisson
> > equation must be recomputed each time step. I believe this is the same
> > situation as Michele Rosso, who posted on this list recently.
> >
> > Best regards,
> > Åsmund Ervik

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener