Question #700611 on Yade changed:
https://answers.launchpad.net/yade/+question/700611
Status: Open => Answered
Bruno Chareyre proposed the following answer:
Hi,
This has a 0% chance of working.
MPI uses a domain decomposition with one yade instance running each subdomain.
What happens in your script is that each Yade instance has its own 2PFV engine
trying to triangulate and solve a flow problem on a subset of particles, with no
walls and no knowledge of what happens in the other subdomains (hence
inconsistent triangulations, no flux connectivity across subdomains,
etc.).
Worse than that, particles can be exchanged between subdomains, so 2PFV may
look for positions of particles which are no longer in the scene.
You need to think about what really happens with domain decomposition if you
want to understand what can/cannot work.
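To see the problem concretely, here is a minimal conceptual sketch in plain
Python (no Yade, no MPI; all names are illustrative, not Yade API). It splits
a toy 1-D "scene" of particles into two subdomains, as domain decomposition
would, and builds a toy nearest-neighbour "triangulation" per subdomain. The
link crossing the subdomain boundary is lost, which is exactly the missing
flux connectivity described above:

```python
# Toy scene: particle id -> x position along a line.
particles = {i: float(i) for i in range(10)}

def decompose(parts, n_subdomains):
    """Split particles by position, as MPI domain decomposition would."""
    xmax = max(parts.values()) + 1e-9
    subs = [dict() for _ in range(n_subdomains)]
    for pid, x in parts.items():
        subs[int(n_subdomains * x / xmax)][pid] = x
    return subs

def flow_links(sub):
    """Toy 'triangulation': link each particle to its neighbour,
    using ONLY the particles this solver instance can see."""
    ids = sorted(sub, key=sub.get)
    return set(zip(ids, ids[1:]))

subs = decompose(particles, 2)
# Each subdomain's solver works on its own particle subset...
local = set().union(*(flow_links(s) for s in subs))
# ...while a correct global solve would link the whole scene.
full = flow_links(particles)
missing = full - local
print(missing)  # the boundary-crossing link no subdomain solver ever sees
```

With two subdomains, the link between the last particle of subdomain 0 and the
first particle of subdomain 1 never appears in any local triangulation, so any
flux through that pore is simply absent. Particle migration between subdomains
(the second issue above) would make the local subsets change under the
solver's feet as well.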
In this case it would need a specific implementation of 2PFV in order to
support MPI.
Bruno
--
You received this question notification because your team yade-users is
an answer contact for Yade.
_______________________________________________
Mailing list: https://launchpad.net/~yade-users
Post to : [email protected]
Unsubscribe : https://launchpad.net/~yade-users
More help : https://help.launchpad.net/ListHelp