On 1/10/21 12:43 PM, Konrad Simon wrote:
Using one MPI rank is fine, but if I use more I get the error:
"An error occurred in line <666> of file
</home/ksimon/lib/dealii-9.2.0-src/source/lac/trilinos_vector.cc> in function
dealii::TrilinosScalar
dealii::TrilinosWrappers::MPI::Vector::operator()(dealii::TrilinosWrappers::MPI::Vector::size_type)
const
The violated condition was:
false
Additional information:
You tried to access element 3023 of a distributed vector, but this element
is not stored on the current processor. Note: There are 4456 elements stored
on the current processor from within the range 5544 through 9999 but Trilinos
vectors need not store contiguous ranges on each processor, and not every
element in this range may in fact be stored locally."
Now, I know what that means. Is there anything I can do? A workaround?
I suspect you are passing a fully distributed vector to that function, but it
needs to read ghost elements of the vector. Have you tried copying the vector
into a locally_relevant vector, and passing that to the function in question?
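To illustrate the suggestion: a minimal sketch of copying a fully distributed Trilinos vector into a ghosted (locally relevant) one before reading off-processor elements. The names `dof_handler`, `mpi_communicator`, and `distributed_solution` are assumptions standing in for whatever the actual program uses:

```cpp
#include <deal.II/base/index_set.h>
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/lac/trilinos_vector.h>

// Determine which DoFs this rank owns and which it needs to read
// (owned + ghost entries on cells adjacent to locally owned ones).
const dealii::IndexSet locally_owned_dofs = dof_handler.locally_owned_dofs();
dealii::IndexSet       locally_relevant_dofs;
dealii::DoFTools::extract_locally_relevant_dofs(dof_handler,
                                                locally_relevant_dofs);

// A ghosted vector: read access to all locally relevant elements.
dealii::TrilinosWrappers::MPI::Vector ghosted_solution(locally_owned_dofs,
                                                       locally_relevant_dofs,
                                                       mpi_communicator);

// The assignment triggers the MPI communication that imports the
// ghost values from their owning processors.
ghosted_solution = distributed_solution;

// ghosted_solution can now be passed to the function that reads
// individual elements, e.g. ghosted_solution(3023), without triggering
// the "element not stored on the current processor" assertion.
```

Note that ghosted vectors are read-only in deal.II; writing should still go through the fully distributed vector, followed by another assignment to refresh the ghost values.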
Best
W.
--
------------------------------------------------------------------------
Wolfgang Bangerth email: [email protected]
www: http://www.math.colostate.edu/~bangerth/
--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see
https://groups.google.com/d/forum/dealii?hl=en