On Wed, 26 Feb 2014 15:50:32 +0100
Heinz Zorn <[email protected]> wrote:
> Running the given code gives the attached result. Here is the code
> once more (the first time it was hidden in the link below, sorry).
A Function.update() call is missing somewhere in the DirichletBC
implementation. For a workaround, see the code below. I'll report
a bug.
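
In isolation, the change is the extra update() call on the projected
function (a minimal sketch; update() presumably refreshes the
off-process values of the projection before DirichletBC reads them):

u0 = Expression(("x[0]", "0", "0"))
u0 = project(u0, V)
u0.update()   # workaround: refresh off-process values of the projection
bc = DirichletBC(V, u0, u0_boundary)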
>
> from dolfin import *
>
> mesh = UnitCubeMesh( 10, 10, 10 )
>
> V = VectorFunctionSpace( mesh, "CG", 1 )
>
> u0 = Expression(("x[0]","0","0"))
> u0 = project( u0, V )
u0.update()   # <-- workaround: add this line
Jan
>
> def u0_boundary(x, on_boundary):
> return on_boundary
> bc = DirichletBC(V, u0, u0_boundary)
>
> def sigmaIso( u, lmbda, mu ):
> return 2*mu*sym(grad(u))+lmbda*tr(grad(u))*Identity(u.cell().d)
>
> E = 100
> nu = 0.3
> mu = E/(2.0*(1.0+nu))
> lmbda = E*nu/((1.0+nu)*(1.0-2.0*nu))
>
> u = TrialFunction( V )
> v = TestFunction( V )
> pde = inner( sigmaIso(u,lmbda,mu), sym(grad(v)) )*dx
> a, L = system(pde)
> u = Function( V )
> problem = LinearVariationalProblem(a, L, u, bc)
> solver = LinearVariationalSolver(problem)
> solver.parameters["linear_solver"] = "lu"
> solver.parameters["preconditioner"] = "none"
>
> solver.solve()
>
> fileu = File( "u.pvd" )
> fileu << u
>
> On 26.02.2014 15:36, Jan Blechta wrote:
> > On Wed, 26 Feb 2014 15:25:01 +0100
> > Heinz Zorn <[email protected]> wrote:
> >
> >> Hello,
> >>
> >> even using LU as the linear solver does not give the right solution.
> >> It looks like the Dirichlet condition is set to zero at the borders
> >> of the mesh partition.
> > And how should we reproduce it?
> >
> > Jan
> >
> >> Heinz
> >>
> >> On 26.02.2014 15:05, Jan Blechta wrote:
> >>> Hi,
> >>>
> >>> I encountered crashes (segfaults or PETSc errors 76 or 77) when
> >>> using hypre with the OpenMPI 1.4.3 supplied with Ubuntu Precise.
> >>> Recompiling the whole stack of libraries against OpenMPI 1.6.5
> >>> solved the issue.
> >>>
> >>> Try switching to another preconditioner to check the hypothesis.
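> >>>
> >>> For example, something like this (just a sketch; "gmres" with
> >>> "jacobi" is only one non-hypre combination to test with):
> >>>
> >>> solver.parameters["linear_solver"] = "gmres"
> >>> solver.parameters["preconditioner"] = "jacobi"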
> >>>
> >>> Jan
> >>>
> >>>
> >>> On Wed, 26 Feb 2014 12:15:35 +0100
> >>> Heinz Zorn <[email protected]> wrote:
> >>>
> >>>> Hello everybody,
> >>>>
> >>>> I have got a problem with the attached code when it is run in
> >>>> parallel using mpirun. The more processes I use, the more often
> >>>> it crashes with the message:
> >>>>
> >>>> Traceback (most recent call last):
> >>>>   File "test.py", line 32, in <module>
> >>>>     solver.solve()
> >>>> RuntimeError:
> >>>>
> >>>> *** -------------------------------------------------------------------------
> >>>> *** DOLFIN encountered an error. If you are not able to resolve this issue
> >>>> *** using the information listed below, you can ask for help at
> >>>> ***
> >>>> ***     [email protected]
> >>>> ***
> >>>> *** Remember to include the error message listed below and, if possible,
> >>>> *** include a *minimal* running example to reproduce the error.
> >>>> ***
> >>>> *** -------------------------------------------------------------------------
> >>>> *** Error:   Unable to successfully call PETSc function 'KSPSolve'.
> >>>> *** Reason:  PETSc error code is: 76.
> >>>> *** Where:   This error was encountered inside
> >>>> ***          /build/buildd/dolfin-1.3.0+dfsg/dolfin/la/PETScKrylovSolver.cpp.
> >>>> *** Process: 11
> >>>> ***
> >>>> *** DOLFIN version: 1.3.0
> >>>> *** Git changeset:  unknown
> >>>> *** -------------------------------------------------------------------------
> >>>> Using only a few processes the program terminates properly, but
> >>>> the results are obviously not correct. The installation is the
> >>>> PPA installation on a compute server running Ubuntu 13.10
> >>>> Server. It seems that commenting out line 8 and using the
> >>>> Expression to define the boundary condition solves the problem.
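> >>>>
> >>>> In other words, roughly this (a sketch; line 8 is presumably the
> >>>> project() call, so the Expression goes to DirichletBC directly):
> >>>>
> >>>> u0 = Expression(("x[0]","0","0"))
> >>>> # u0 = project( u0, V )   # line 8, commented out
> >>>> bc = DirichletBC(V, u0, u0_boundary)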
> >>>>
> >>>> Please tell me if any further information is needed or if I
> >>>> should post this problem anywhere else.
> >>>>
> >>>> Thanks in advance,
> >>>> Heinz Zorn
> >>>>
> >>
_______________________________________________
fenics mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics