> Also: Ben, is current_local_solution a serial vector? It certainly
> looks that way to me, but I've always been a little confused by the
> solution/current_local_solution divide, so maybe I'm missing some sort
> of PETSc magic under the hood.
In the beginning...
I understood a little about parallel computing and less about PETSc.
"solution" is a parallel vector, and "current_local_solution" is a
serialized version of a *subset* of "solution". Thus it is full-sized, but
only a fraction of the entries are correctly populated from remote
processors.
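To make that concrete, here is a rough sketch of how the two vectors
typically get touched (this assumes the usual libMesh headers and an
initialized System, and the dof indices are made up -- it's a sketch, not a
recipe):

// write only to entries this processor owns in the parallel "solution"
system.solution->set(owned_dof, value);
system.solution->close();

// System::update() localizes the owned entries plus the send-list
// (ghost) entries into the serial "current_local_solution"
system.update();

// reads of owned + ghost entries are now valid; everything else is stale
Number u = (*system.current_local_solution)(ghost_dof);
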
Now, let "global" and "local" be, uh, global and local vectors,
respectively. Calling
global.localize(local);
Is kinda like an allreduce - at the end each processor has a complete
picture of the global vector stored in local.
global.localize(local, index_list); // this is what is done in
                                    // System::update()

copies into "local" only the subset of "global" specified by "index_list".
I'm not sure exactly how PETSc implements these sends, but I bet that in the
former (full) case the underlying memory allocations are what you are seeing
once you get to 100s of processors.
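In code, the difference between the two forms boils down to something like
this (the index values are illustrative and I'm going from memory on the
exact signatures):

NumericVector<Number> &global = *system.solution;
NumericVector<Number> &local  = *system.current_local_solution;

// full localize: every processor ends up with all n_global entries
global.localize(local);

// restricted localize: only the entries named in index_list (plus the
// locally owned ones) are guaranteed to be current in "local"
std::vector<unsigned int> index_list; // e.g. the DofMap send list
global.localize(local, index_list);
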
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Where I want this to go is something like the following:
DistributedVector<Real> vec(n_global, n_local, array_of_remote_indices);
Where "vec" is a vector which stores n_local entries, and also has "ghost
storage" for array_of_remote_indices.size() entries.
You then go about your business, computing with "vec" and performing
operations which update the local entries, and when you need to get
synchronized copies of the remote values you call something like
vec.localize();
This would effectively eliminate current_local_solution -- everything would
be done in-place from solution.
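In use, that might look something like the following. None of this exists
yet -- the constructor and no-argument localize() are just the proposal
above, and set()/close() and the read are placeholders for whatever the
real interface ends up being:

// hypothetical, proposed interface
std::vector<unsigned int> remote_indices; // ghost dofs this proc reads
DistributedVector<Real> vec(n_global, n_local, remote_indices);

// compute, writing only to locally owned entries
vec.set(my_first_local_dof, 1.0);
vec.close();

// pull current values for the ghost entries from their owners, in place
vec.localize();

// reads of both owned and ghost entries are now up to date
Real u = vec(remote_indices.front());
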
-Ben