Hi,

I'm trying to implement a new solution algorithm in our code. This involves 
performing some algebraic operations using distributed sparse matrices and 
vectors and then checking convergence by integrating the residual vector 
over the entire system (I'm doing an integration instead of just evaluating 
the norm, because the norm will lead to problems when you change the grid 
size). I can perform the algebraic operations without a problem, but when I 
start looping over cells to integrate the residual vector, I get 
out-of-bounds errors. From what I've been able to gather from debugging, the 
loop over cells on process 0 seems to be fine, but the loops on the other 
processors crash immediately. The algebraic operations consist of the 
following code:

      LA::MPI::Vector delta_rhs(p_locally_owned_dofs, p_comm);
      p_system.vmult(delta_rhs, p_local_solution);
      delta_rhs.sadd(-1.0, p_rhs);
      delta_rhs.compress(VectorOperation::insert);
      double res = evaluateResidualNorm(delta_rhs);

where p_system is a sparse matrix and p_local_solution is a distributed 
vector with ghost cells. The evaluateResidualNorm is given by

    double evaluateResidualNorm(LA::MPI::Vector &residual) {
      // Get information on shape functions so we can do the integral
      QGauss<2> quad_formula(2);
      FEValues<2> fe_values(p_fe, quad_formula, update_values |
          update_quadrature_points | update_JxW_values);
      const unsigned int dofs_per_cell = p_fe.dofs_per_cell;
      const unsigned int n_q_points = quad_formula.size();
      double totalRes = 0.0;
      std::vector<types::global_dof_index>
        local_dof_indices(dofs_per_cell);
      typename DoFHandler<2>::active_cell_iterator
        cell = p_dof.begin_active(),
        endc = p_dof.end();
      for (; cell != endc; ++cell) {
        if (cell->is_locally_owned()) {
          fe_values.reinit(cell);
          cell->get_dof_indices(local_dof_indices);
          for (unsigned int q = 0; q < n_q_points; ++q) {
            for (unsigned int i = 0; i < dofs_per_cell; ++i) {
              //printf("p[%d] (evaluateResidualNorm) Got to 1\n", p_me);
              double dQ = std::fabs(residual[local_dof_indices[i]]);
              //printf("p[%d] (evaluateResidualNorm) Got to 2\n", p_me);
              double weight = fe_values.shape_value(i, q) * fe_values.JxW(q);
              totalRes += weight * dQ;
            }
          }
        }
      }
      totalRes = Utilities::MPI::sum(totalRes, p_comm);
      return totalRes;
    }

and the error is

terminate called after throwing an instance of
'dealii::PETScWrappers::internal::VectorReference::ExcAccessToNonlocalElement'
 what():
--------------------------------------------------------
An error occurred in line <129> of file
</people/d3g293/software/dealii-8.4.0/source/lac/petsc_vector_base.cc>
in function
    dealii::PETScWrappers::internal::VectorReference::operator
PetscScalar() const
The violated condition was:
    (index >= static_cast<size_type>(begin)) && (index <
static_cast<size_type>(end))
The name and call sequence of the exception was:
    ExcAccessToNonlocalElement (index, begin, end-1)
Additional Information:
You tried to access element 1172 of a distributed vector, but only
elements 1581 through 1680 are stored locally and can be accessed.
--------------------------------------------------------

I've used this loop structure elsewhere in the code without difficulty. Is 
there something I need to do to the delta_rhs vector after performing the 
algebraic operations before I can safely access individual elements? I've 
run into problems with PETSc matrix-vector multiplies in the past, where 
the result vector can end up with a different distribution than the 
original vector in the multiply if you are not careful about how the 
matrix is partitioned.
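
My best guess so far is that I need to copy the result into a vector that 
also stores ghost entries before reading individual elements, along the 
lines of the sketch below. Here p_locally_relevant_dofs is a placeholder 
for whatever IndexSet of locally relevant DoFs the class would need to 
store; I haven't verified that this actually fixes the problem:

```cpp
// Hypothetical fix: make a ghosted copy of delta_rhs so that reading
// residual[local_dof_indices[i]] for DoFs owned by neighboring
// processes does not throw ExcAccessToNonlocalElement.
LA::MPI::Vector ghosted_rhs(p_locally_owned_dofs,
                            p_locally_relevant_dofs,  // assumed IndexSet
                            p_comm);
ghosted_rhs = delta_rhs;  // copies values and updates ghost entries
double res = evaluateResidualNorm(ghosted_rhs);
```

Is that the right approach, or is something else going on?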

Bruce

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
