On Wed, Nov 27, 2013 at 12:43 PM, Geoffrey Irving <[email protected]> wrote:
> Currently, FE fields on a DMPlex are created with the function
>
>   DMPlexProjectFunction
>
> This function has three issues:
>
> 1. It doesn't take a void* argument, and therefore requires global
>    variables if you need to pass in additional data.

I am not opposed to DMPlex functions taking contexts. However, my question
is: how is it intended to receive this context? So far, I am unconvinced
that PetscFE needs an outside context. It just seems like bad design to me.
The original intention is to pass evaluation data in as a mesh field. How
does this break down?

> 2. It operates one quadrature point at a time, so is inefficient in
>    situations where evaluation overhead can be amortized across multiple
>    points (especially from scripts). For the workhorse FE routines this
>    is inconvenient but acceptable since the goal is probably OpenCL, but
>    it would be nice to have access to Python and numpy in the setup
>    phase.

I think this is oversimplified. DMPlex does operate in batches of
quadrature points, in fact in batches of cells (although the batch size is
1 now). It is PetscDualSpace that calls things one quadrature point at a
time. I would think that the Python interface would come in at the level of
the PetscDualSpace evaluator. How does this differ from what you had
planned?

> 3. It doesn't pass a cell argument, so data in an existing mesh field
>    requires a hierarchy traversal to access.

I think I do not understand something fundamental about how you are using
this function. This is intended to be an orthogonal projection of the
function f onto the finite element space V. Why would we need a cell index
or additional information? Is it because the function f you have is not
analytic, but another mesh field? I will think about optimizing this case.

  Matt

> I'm going to make an alternate version of DMPlexProjectFunction that
> addresses these issues. If you think it's too early to know the best
> way to do this in core PETSc, I can put the routine in my code for now
> and we can revisit migrating it into PETSc later. I'm fine either way.
> Concretely, these issues are solved by
>
> 1. Adding a void* argument.
> 2. Batching quadrature point evaluation.
> 3. Passing the cell index.
>
> so there aren't a lot of choices to get wrong. Really the only choice
> is who chooses the batch size.
>
> Geoffrey

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead. -- Norbert Wiener
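
For concreteness, here is a rough sketch (not a patch) of what the batched,
context-carrying, cell-aware callback discussed above could look like. The
names BatchEvalFunc, ScaleCtx, and scale_x are made up for illustration, and
plain C types stand in for PetscInt/PetscReal/PetscScalar so the example
compiles on its own; the real interface would of course live behind
DMPlexProjectFunction or a variant of it.

  /* Sketch of a batched evaluation callback: many quadrature points per
   * call, the owning cell of each point, and a user context pointer.
   * All names here are hypothetical. */
  #include <stdio.h>

  /* Evaluate npoints quadrature points at once. cells[i] is the cell that
   * owns point i, x holds dim coordinates per point, ctx is user data, and
   * the results go into values (ncomp components per point). */
  typedef void (*BatchEvalFunc)(int npoints, int dim, const int cells[],
                                const double x[], int ncomp, double values[],
                                void *ctx);

  /* Example user callback: scales the first coordinate by a factor stored
   * in the context, illustrating why a void* argument avoids globals. */
  typedef struct { double scale; } ScaleCtx;

  static void scale_x(int npoints, int dim, const int cells[],
                      const double x[], int ncomp, double values[], void *ctx)
  {
    ScaleCtx *user = (ScaleCtx *) ctx;
    int       i, c;

    for (i = 0; i < npoints; ++i) {
      (void) cells[i];  /* cell index available here for mesh-field lookups */
      for (c = 0; c < ncomp; ++c) values[i*ncomp + c] = user->scale * x[i*dim + 0];
    }
  }

  /* Toy driver standing in for the projection routine: it would gather the
   * quadrature points of a batch of cells and make one call per batch. */
  int main(void)
  {
    ScaleCtx      ctx      = {2.0};
    const int     cells[2] = {7, 7};
    const double  x[4]     = {0.25, 0.5, 0.75, 0.5};  /* two 2D points in cell 7 */
    double        vals[2];
    BatchEvalFunc f        = scale_x;

    f(2, 2, cells, x, 1, vals, &ctx);
    printf("values: %g %g\n", vals[0], vals[1]);
    return 0;
  }

On the remaining question of who chooses the batch size: in this shape the
caller (the projection routine) picks npoints, so the callback is written
once and keeps working for whatever batching the library settles on.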
