I have to admit that I didn't read your code carefully enough to really
understand how you did the coloring. Regardless, it's probably fine: using a
2D approximation for a 2D calculation gets rid of the error in the example
you provided :)
On Thu, Jan 18, 2018 at 8:54 PM, Juan Carlos Araujo
Thanks a lot for looking at this issue so quickly!
No problem, you can use it. That was the idea of sharing the minimal
example! I am always happy to contribute with the library.
So, does this mean that the way I colorize is correct? If there is a
better/simpler way, please let me know.
Is the
I figured it out: we have an optimization that assumes in one particular
place that, if we have eight points, they must be the vertices of a cube.
However, in 2D, we use eight points to place new points on quadrilaterals:
assuming these are the vertices of a cube is wrong. The original
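The assumption described above only holds in 3D: a hypercube cell has 2^dim vertices, so eight points are cube vertices only when dim == 3. Here is a standalone sketch of that count (plain C++, not the actual deal.II code; the helper name is made up):

```cpp
#include <cassert>

// A hypercube reference cell in dim dimensions has 2^dim vertices:
// 4 for a quadrilateral (dim = 2), 8 for a hexahedron (dim = 3).
constexpr unsigned vertices_per_cell(unsigned dim)
{
  return 1u << dim;
}

// Hence eight points can be the vertices of a cell only in 3D; in 2D a
// quadrilateral has four vertices, so eight points must mean something
// else (here: new points to be placed on quadrilaterals).
```

A check like `n_points == vertices_per_cell(dim)` would distinguish the two cases instead of hard-coding eight.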
Bruno,
> This says that you have a problem with MPI. Your program died before
> executing any code in deal.II. Can you run other MPI codes?
Yes, I can run step-40 in parallel. This problem happens only after I added
a DoFRenumbering call in it (step-40.cc originally does not call
Jie,
2018-01-18 14:31 GMT-05:00 Jie Cheng:
> Fatal error in MPI_Init_thread: Other MPI error, error stack:
> MPIR_Init_thread(474):
> MPID_Init(152)...: channel initialization failed
> MPID_Init(426)...: PMI_Get_appnum returned -1
> [cli_0]: write_line error;
Ignore my question about projection. Somehow I thought I remembered that
the projection functions don't support Nedelec elements in deal.II - my bad.
On Thursday, January 18, 2018 at 6:57:06 PM UTC+1, Pascal Kraft wrote:
>
> Thanks for your fast reply!
> About your first point: Yes, I currently use a
Thanks for your fast reply!
About your first point: Yes, I currently use an FESystem composed of two 3D
Nedelec elements.
On the second point I agree with you. Setting the additional values should
not be a problem, since I know the analytical solution of the problem.
Jie,
On Thursday, January 18, 2018 at 11:17:55 AM UTC-5, Jie Cheng wrote:
>
> I found that DoFRenumbering with distributed dof_handler does not work on
> my mac (Mac OS X 10.13.2). My dealii is built with PETSc and Step-40 runs
> fine. But once I added DoFRenumbering::Cuthill_McKee(dof_handler)
Hey there,
I have some familiarity with the transfinite interpolation implementation
and am looking into this now. It looks like, in the transformation, one of
the 'reference cell points' is calculated as {20, 20}: I don't yet see why.
I suspect that we now create a twisted cell (or something
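One cheap way to catch such a point early is a bounds check on the computed reference coordinates, since anything outside [0,1]^dim (up to a tolerance) signals a broken inverse mapping. A standalone sketch (plain C++, not the actual deal.II implementation; the helper name is made up):

```cpp
#include <array>

// Returns true if p lies inside the unit reference cell [0,1]^dim,
// allowing a small tolerance for round-off at the boundary.
template <int dim>
bool inside_reference_cell(const std::array<double, dim> &p,
                           const double tol = 1e-10)
{
  for (const double c : p)
    if (c < -tol || c > 1.0 + tol)
      return false;
  return true;
}
```

A point such as {20, 20} fails this check by a wide margin, so an assertion on it would turn the silent twisted-cell behavior into an immediate, debuggable error.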
Hi all
I found that DoFRenumbering with distributed dof_handler does not work on
my mac (Mac OS X 10.13.2). My dealii is built with PETSc and Step-40 runs
fine. But once I added DoFRenumbering::Cuthill_McKee(dof_handler)
after dof_handler.distribute_dofs (fe), it gave me runtime error:
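For context on what that call does: Cuthill-McKee renumbers unknowns breadth-first, starting from a low-degree node and visiting neighbors in order of increasing degree, which tends to reduce matrix bandwidth. A standalone sketch of the serial algorithm on a plain adjacency list (not deal.II's implementation, and it ignores the distributed/MPI aspect that triggers the crash; assumes a connected graph):

```cpp
#include <algorithm>
#include <queue>
#include <vector>

// Plain (non-reversed) Cuthill-McKee ordering of a connected graph
// given as an adjacency list. Returns the nodes in their new order.
std::vector<int> cuthill_mckee(const std::vector<std::vector<int>> &adj)
{
  const int n = static_cast<int>(adj.size());
  std::vector<int> order;
  std::vector<bool> seen(n, false);

  // Start from a node of minimum degree.
  int start = 0;
  for (int i = 1; i < n; ++i)
    if (adj[i].size() < adj[start].size())
      start = i;

  std::queue<int> queue;
  queue.push(start);
  seen[start] = true;

  while (!queue.empty())
    {
      const int u = queue.front();
      queue.pop();
      order.push_back(u);

      // Enqueue unvisited neighbors, lowest degree first.
      std::vector<int> neighbors;
      for (const int v : adj[u])
        if (!seen[v])
          {
            seen[v] = true;
            neighbors.push_back(v);
          }
      std::sort(neighbors.begin(), neighbors.end(),
                [&adj](int a, int b) { return adj[a].size() < adj[b].size(); });
      for (const int v : neighbors)
        queue.push(v);
    }
  return order;
}
```

On a path graph 0-1-2-3 this starts at an endpoint and walks the chain, which is exactly the bandwidth-reducing behavior one wants from the renumbering.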