On Tue, Sep 15, 2015 at 3:55 AM, Stephan Herb <
inf74...@stud.uni-stuttgart.de> wrote:

> Thanks, Julian, for addressing this problem; although I changed it to the
> correct version, nothing changed (well, now only the displacements for
> the local nodes get printed instead of all nodes ^^).
> I came across another issue: if you run example 4 of "Systems of
> Equations" (Linear Elastic Cantilever) with the command "mpirun -np 2
> ./example-opt", you get "-nan" as solutions. When you export the results,
> e.g. as VTU files, one part cannot be loaded in a visualization program
> like ParaView while the other part can be loaded but only contains zeros...


From your config.log file, I confirmed that you aren't using PETSc, and the
Eigen sparse solvers are being used instead.

Since the Eigen sparse solvers are serial, you can't use them with mpirun.
I confirmed the result you described by running:

mpirun -np 2 ./example-opt --use-eigen

That is, half of the solution consists of NaNs while the other half is all
zeros, as shown here:
https://drive.google.com/file/d/0B9BK7pg8se_ibkdpckxpVFI1LWM/view?usp=sharing

So, your fix is either to run serially or to install PETSc and make sure that
libMesh configures and builds against it.

We should probably try to prevent this type of configuration from getting
run by throwing an error if the Eigen sparse solvers are invoked when
n_processors() > 1.
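
Something like the following is what I have in mind. This is just a sketch, and
the exact location and names (e.g. putting the check in
EigenSparseLinearSolver::init() and using libmesh_error_msg()) may need
adjusting to match what's actually in the tree:

// Sketch only: refuse to use the serial-only Eigen solvers on a
// partitioned problem instead of silently producing NaNs and zeros.
#include "libmesh/eigen_sparse_linear_solver.h"
#include "libmesh/libmesh_common.h" // libmesh_error_msg()

template <typename T>
void EigenSparseLinearSolver<T>::init (const char * /*name*/)
{
  // The Eigen sparse solvers have no distributed-memory support, so
  // error out as soon as we are asked to run on more than one rank.
  if (this->n_processors() > 1)
    libmesh_error_msg("The Eigen sparse solvers are serial-only; "
                      "run on a single processor or build libMesh with PETSc.");

  // ... existing initialization ...
}

Doing the check in init() (or the constructor) would catch the misconfiguration
before any solve is attempted.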

@roystgnr any preferences for how to do this?

-- 
John