The fix is in master.
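
For anyone hitting the same problem before updating: the underlying issue
was template argument deduction, so one general remedy (a self-contained
sketch with hypothetical names, not necessarily what the actual commit
does) is to take the op as a concrete MPI_Op instead of a deduced
parameter X, so that an anonymous-enum constant converts to MPI_Op at the
call site and the Table specialization matches again:

  #include <iostream>

  typedef int MPI_Op;              // integer-typed MPI_Op, as in MPT
  enum { ANON_MAX = 0x58000001 };  // MPT-like MPI_MAX: <anonymous enum>

  struct Table {};

  // op is a plain MPI_Op, so nothing is deduced from the constant's type
  template<typename T>
  T all_reduce(const T& value, MPI_Op op)
  {
    return value;  // generic body; real code would call MPI_Allreduce
  }

  template<>
  Table all_reduce(const Table& value, MPI_Op op)
  {
    std::cout << "Table specialization chosen\n";
    return value;
  }

  int main()
  {
    Table t;
    all_reduce(t, ANON_MAX);  // enum converts to MPI_Op;
                              // the specialization is used
  }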

Jan


On Thu, 2 Apr 2015 15:09:19 +0200
Jan Blechta <[email protected]> wrote:

> So pushing into next. Expect it to reach master in a few days.
> 
> On Thu, 2 Apr 2015 14:55:38 +0200
> Corrado Maurini <[email protected]> wrote:
> 
> > Nice, it works! (or at least it compiles)
> 
> You could try running the timing demo in parallel.
> 
> Jan
> 
> > 
> > Thanks a lot.
> > 
> > Corrado
> > 
> > Corrado Maurini
> > [email protected]
> > 
> > 
> > 
> > > On 02 Apr 2015, at 14:27, Jan Blechta <[email protected]>
> > > wrote:
> > > 
> > > Please try again with DOLFIN
> > > 94ba11b4f63c83590d1dce2583754fe85ae6cb08
> > > 
> > > Jan
> > > 
> > > 
> > > On Thu, 2 Apr 2015 11:46:12 +0200
> > > Jan Blechta <[email protected]> wrote:
> > > 
> > >> Hi Corrado,
> > >> 
> > >> I'm looking into this, and it seems to me that there is something
> > >> non-standard about the MPI_Op definition in your MPI implementation.
> > >> The compiler deduces MPI_MAX's type as <anonymous enum>, while it
> > >> should be MPI_Op, so that
> > >> 
> > >> template<>
> > >>   Table dolfin::MPI::all_reduce(MPI_Comm, const Table&, MPI_Op);
> > >> 
> > >> specialization is used rather than
> > >> 
> > >> template<typename T, typename X>
> > >>   T dolfin::MPI::all_reduce(MPI_Comm comm, const T& value, X op)
> > >> 
> > >> which is expected to fail for T=Table, X=MPI_Op.
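> > >> 
> > >> To make the mechanism concrete, here is a minimal self-contained
> > >> sketch (hypothetical names, no MPI headers needed) of how I read the
> > >> failure: when the constant has <anonymous enum> type, X deduces to
> > >> the enum type, the generic template is instantiated for T=Table, and
> > >> the static_assert fires, as in your build log:
> > >> 
> > >>   #include <iostream>
> > >> 
> > >>   typedef int MPI_Op;              // standard-like: MPI_MAX should
> > >>                                    // have this type
> > >>   enum { ANON_MAX = 0x58000001 };  // MPT-like: <anonymous enum>
> > >> 
> > >>   struct Table {};
> > >> 
> > >>   template<typename T>
> > >>   struct dependent_false { static const bool value = false; };
> > >> 
> > >>   // generic template; the guard mirrors the static_assert that
> > >>   // dolfin::MPI::mpi_type() uses in MPI.h
> > >>   template<typename T, typename X>
> > >>   T all_reduce(const T& value, X op)
> > >>   {
> > >>     static_assert(dependent_false<T>::value, "Unknown MPI type");
> > >>     return value;
> > >>   }
> > >> 
> > >>   // specialization for T=Table; only used when X deduces to MPI_Op
> > >>   template<>
> > >>   Table all_reduce(const Table& value, MPI_Op op)
> > >>   {
> > >>     std::cout << "Table specialization chosen\n";
> > >>     return value;
> > >>   }
> > >> 
> > >>   int main()
> > >>   {
> > >>     Table t;
> > >>     all_reduce(t, static_cast<MPI_Op>(ANON_MAX));  // OK: X = MPI_Op
> > >>     // all_reduce(t, ANON_MAX);  // X = <anonymous enum>: the generic
> > >>     //                           // template is instantiated and the
> > >>     //                           // static_assert fires
> > >>   }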
> > >> 
> > >> Could you point me to the headers with the declarations of MPI_Op
> > >> and MPI_MAX?
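> > >> 
> > >> If it helps, a quick compile-time probe, assuming C++11 and the MPT
> > >> <mpi.h>, that reports whether the headers give MPI_MAX the type
> > >> MPI_Op:
> > >> 
> > >>   #include <mpi.h>
> > >>   #include <type_traits>
> > >> 
> > >>   // std::decay strips any reference or cv-qualifiers in case
> > >>   // MPI_MAX expands to an lvalue rather than a plain constant
> > >>   static_assert(
> > >>       std::is_same<std::decay<decltype(MPI_MAX)>::type,
> > >>                    MPI_Op>::value,
> > >>       "MPI_MAX does not have type MPI_Op in this implementation");
> > >> 
> > >>   int main() {}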
> > >> 
> > >> Jan
> > >> 
> > >> 
> > >> On Wed, 1 Apr 2015 22:50:12 +0200
> > >> Corrado Maurini <[email protected]> wrote:
> > >> 
> > >>> Hi all,
> > >>> 
> > >>> I got the error below when building dolfin-dev
> > >>> (git:05296b363ece99fca64bd1b9312a5c2ecefe0777) on an SGI
> > >>> shared-memory machine using mpt/2.11 as the MPI library:
> > >>> 
> > >>> You can find the full build.log here: http://pastebin.com/jhEW1WHd
> > >>> 
> > >>> FYI, I use hashdist to build, but I think this is completely
> > >>> transparent. I can build FEniCS 1.5 without problems.
> > >>> 
> > >>> Can someone help? I think that the recent changes introduced
> > >>> some compatibility issues.
> > >>> 
> > >>> 
> > >>> 2015/04/01 22:30:03 - INFO: [package:run_job] [ 65%] Building CXX object dolfin/CMakeFiles/dolfin.dir/la/uBLASVector.cpp.o
> > >>> 2015/04/01 22:30:08 - INFO: [package:run_job] [ 65%] Building CXX object dolfin/CMakeFiles/dolfin.dir/log/Event.cpp.o
> > >>> 2015/04/01 22:30:08 - INFO: [package:run_job] [ 65%] Building CXX object dolfin/CMakeFiles/dolfin.dir/log/LogManager.cpp.o
> > >>> 2015/04/01 22:30:09 - INFO: [package:run_job] [ 67%] Building CXX object dolfin/CMakeFiles/dolfin.dir/log/LogStream.cpp.o
> > >>> 2015/04/01 22:30:12 - INFO: [package:run_job] [ 67%] Building CXX object dolfin/CMakeFiles/dolfin.dir/log/Logger.cpp.o
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] In file included from /opt/dev/libs/fenics/hashdist-builds/tmp/dolfin-jt5cc6fobjg7/dolfin/log/Logger.cpp:45:0:
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] /opt/dev/libs/fenics/hashdist-builds/tmp/dolfin-jt5cc6fobjg7/dolfin/common/MPI.h: In instantiation of 'static MPI_Datatype dolfin::MPI::mpi_type() [with T = dolfin::Table; MPI_Datatype = unsigned int]':
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] /opt/dev/libs/fenics/hashdist-builds/tmp/dolfin-jt5cc6fobjg7/dolfin/common/MPI.h:613:64: required from 'static T dolfin::MPI::all_reduce(MPI_Comm, const T&, X) [with T = dolfin::Table; X = <anonymous enum>; MPI_Comm = unsigned int]'
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] /opt/dev/libs/fenics/hashdist-builds/tmp/dolfin-jt5cc6fobjg7/dolfin/common/MPI.h:623:43: required from 'static T dolfin::MPI::max(MPI_Comm, const T&) [with T = dolfin::Table; MPI_Comm = unsigned int]'
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] /opt/dev/libs/fenics/hashdist-builds/tmp/dolfin-jt5cc6fobjg7/dolfin/log/Logger.cpp:327:43: required from here
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] /opt/dev/libs/fenics/hashdist-builds/tmp/dolfin-jt5cc6fobjg7/dolfin/common/MPI.h:230:7: error: static assertion failed: Unknown MPI type
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job] static_assert(dependent_false<T>::value, "Unknown MPI type");
> > >>> 2015/04/01 22:30:15 - INFO: [package:run_job]        ^
> > >>> 2015/04/01 22:30:16 - INFO: [package:run_job] gmake[2]: *** [dolfin/CMakeFiles/dolfin.dir/log/Logger.cpp.o] Error 1
> > >>> 2015/04/01 22:30:16 - INFO: [package:run_job] gmake[1]: *** [dolfin/CMakeFiles/dolfin.dir/all] Error 2
> > >>> 2015/04/01 22:30:16 - INFO: [package:run_job] gmake: *** [all] Error 2
> > >>> 2015/04/01 22:30:16 - ERROR: [package:run_job] Command '[u'/bin/bash', '_hashdist/build.sh']' returned non-zero exit status 2
> > >>> 2015/04/01 22:30:16 - ERROR: [package:run_job] command failed (code=2); raising
> > >>> 
> > >>> Corrado
> > >>> 
> > >>> 
> > >>> 
> > >> 
> > > 
> 

_______________________________________________
fenics-support mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics-support
