Hi Zhidong,

I am happy to help!

Unfortunately, this is a part of deal.II with older code that has
accumulated some inconsistencies over time. Both inconsistencies stem
from the fact that there was previously a third class,
PETScWrappers::Vector, which wrapped a sequential PETSc vector: both
PETScWrappers::Vector and PETScWrappers::MPI::Vector inherited from
PETScWrappers::VectorBase, and VectorBase contained most of the
functionality that didn't use or need MPI objects.

1. I don't think there is a workaround - the problem with that
function is that one could pass in a serial PETSc vector, which
doesn't make any sense. It's not quite equivalent, but the closest
replacement is to create an MPI::Vector and then extract the Vec,
rather than creating a Vec and then putting it in an MPI::Vector.
2. This is, mostly, a consequence of the historical design I mentioned
above: the end result is confusing because we have not completely
converted this class. Your understanding is correct, except that
MPI::Vector is always a VectorBase *where the Vec is a
parallel-distributed vector*.

Does this make sense?

Thanks,
David

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en