Other possible solutions include (see the sketch below):
* Requiring the user to call an init_dolfin() function before doing
anything (from Python this could be done implicitly when doing "import
dolfin").
* Adding calls to init_mpi() in the functions in MPI.h where it is
missing.
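
A rough sketch of the first option (init_dolfin() is hypothetical, not
an existing DOLFIN function; the guard inside it is the same one the
missing init_mpi() calls would need):

-----------------------------------------------------------------------
#include <mpi.h>

// Hypothetical sketch: an idempotent entry point that initialises MPI
// only if nothing else (user code, another library) has done so yet.
namespace dolfin
{
  void init_dolfin()
  {
    int initialized = 0;
    MPI_Initialized(&initialized);
    if (!initialized)
      MPI_Init(nullptr, nullptr);
  }
}
-----------------------------------------------------------------------

From Python, the implicit variant would then amount to calling this at
module import time in the dolfin package.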

Benjamin

2014-03-13 14:01 GMT+01:00 Benjamin Kehlet <[email protected]>:
> 2014-03-13 13:57 GMT+01:00 Anders Logg <[email protected]>:
>> On Thu, Mar 13, 2014 at 09:36:42AM +0000, Garth N. Wells wrote:
>>>
>>>
>>> On Thu, 13 Mar, 2014 at 9:23 AM, Johannes Ring <[email protected]>
>>> wrote:
>>> >On Wed, Mar 12, 2014 at 6:30 PM, Jan Blechta
>>> ><[email protected]> wrote:
>>> >> On Wed, 12 Mar 2014 17:19:31 +0100
>>> >> Benjamin Kehlet <[email protected]> wrote:
>>> >>
>>> >>> Hello!
>>> >>>
>>> >>> This code snippet
>>> >>>-----------------------------------------------------------------------
>>> >>> from dolfin import *
>>> >>>
>>> >>> # Build a single-triangle mesh by hand
>>> >>> m = Mesh()
>>> >>>
>>> >>> editor = MeshEditor()
>>> >>> editor.open(m, 2, 2)  # topological and geometric dimension 2
>>> >>> editor.init_vertices(3)
>>> >>> editor.add_vertex(0, Point(.0, .0))
>>> >>> editor.add_vertex(1, Point(1., .0))
>>> >>> editor.add_vertex(2, Point(0., 1.))
>>> >>>
>>> >>> editor.init_cells(1)
>>> >>> editor.add_cell(0, 0, 1, 2)
>>> >>> editor.close()
>>> >>>
>>> >>> # Fails here when DOLFIN is built with MPI support
>>> >>> print MeshQuality.radius_ratio_min_max(m)
>>> >>>-----------------------------------------------------------------------
>>> >>>
>>> >>> gives this error (when DOLFIN is built with MPI support)
>>> >>
>>> >> Can't reproduce.
>>> >
>>> >I got the same error as Benjamin.
>>>
>>> I get the error too.
>>>
>>> Initialising MPI is a long-standing issue that we've been dodging.
>>> The solution in other libraries is that the user makes a call at the
>>> start of a program to initialise MPI. We have a sprinkling of
>>> behind-the-scenes MPI initialisation. It's not ideal to have calls
>>> that initialise MPI scattered all over the place. I'm not sure what
>>> the best solution is to balance performance and simplicity.
>>
>> Wouldn't it work to say that all classes that are parallel-aware need
>> to initialize MPI? In this case the Mesh class. The cost should be
>> virtually zero. I don't know what the cost of calling
>> MPI_Initialized() is, but we could add an extra flag in DOLFIN that
>> is set when SubSystemsManager::init_mpi() is called, so that a call
>> to MPI_Initialized() is not necessary when we know that we ourselves
>> have already initialized MPI explicitly.
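
A minimal sketch of the flag Anders describes (class and member names
here are illustrative, not the actual SubSystemsManager code):

-----------------------------------------------------------------------
#include <mpi.h>

// Sketch: a static flag short-circuits the MPI_Initialized() query
// once init_mpi() has run. Repeated calls then cost only a plain
// bool check, with no MPI library call at all.
class SubSystemsManager
{
public:
  static void init_mpi()
  {
    if (_mpi_initialized)  // cheap check, no MPI call
      return;

    int initialized = 0;
    MPI_Initialized(&initialized);  // MPI may be up already (e.g. via PETSc)
    if (!initialized)
      MPI_Init(nullptr, nullptr);

    _mpi_initialized = true;
  }

private:
  static bool _mpi_initialized;
};

bool SubSystemsManager::_mpi_initialized = false;
-----------------------------------------------------------------------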
>
> I was thinking along the same lines, namely to add a call to
> init_mpi() in the constructor of Variable (see the sketch below).
>
> Benjamin
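
For reference, Benjamin's suggestion would amount to something like
this (a sketch only; the real Variable class has more to it):

-----------------------------------------------------------------------
// Sketch: most parallel-aware DOLFIN classes (Mesh, Function, ...)
// derive from Variable, so one idempotent call here would guarantee
// MPI is up before any such object is used.
class Variable
{
public:
  Variable()
  {
    SubSystemsManager::init_mpi();  // cheap after the first call
  }
};
-----------------------------------------------------------------------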
>
>>
>> --
>> Anders