Your message dated Sun, 23 Sep 2018 20:35:47 +0000
with message-id <[email protected]>
and subject line Bug#909407: fixed in dolfin 2018.1.0.post1-12
has caused the Debian Bug report #909407,
regarding pybind11 breaks dolfin autopkgtest
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact [email protected]
immediately.)


-- 
909407: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=909407
Debian Bug Tracking System
Contact [email protected] with problems
--- Begin Message ---
Source: pybind11, dolfin
Control: found -1 pybind11/2.2.4-1
Control: found -1 dolfin/2018.1.0.post1-10
X-Debbugs-CC: [email protected]
User: [email protected]
Usertags: breaks needs-update

Dear maintainers,

With a recent upload of pybind11, the autopkgtest of dolfin fails in
testing when that autopkgtest is run with the binary packages of
pybind11 from unstable. It passes when run with only packages from
testing. In tabular form:
                       pass            fail
pybind11               from testing    2.2.4-1
dolfin                 from testing    2018.1.0.post1-10
all others             from testing    from testing

I copied some of the output at the bottom of this report.

Looking at the changelog of dolfin 2018.1.0.post1-11, I fear that a
versioned Depends or a versioned Breaks is missing somewhere. Note that
the Debian migration software considers those to determine whether
packages need to be tested together from unstable. If dolfin can't
determine the upper version of pybind11 beforehand, a versioned Breaks
in pybind11 helps the migration software use the proper version of
dolfin during dolfin's autopkgtesting. If dolfin 2018.1.0.post1-11 can
migrate without pybind11 2.2.4-1, you could decide to ignore this bug:
once dolfin migrates, the test, which is retried daily, will be run with
that version.

Currently this regression is contributing to the delay of the migration
of pybind11 to testing [1]. Due to the nature of this issue, I filed
this bug report against both packages. Can you please investigate the
situation and reassign the bug to the right package? If needed, please
change the bug's severity.

More information about this bug and the reason for filing it can be found at
https://wiki.debian.org/ContinuousIntegration/RegressionEmailInformation

Paul

[1] https://qa.debian.org/excuses.php?package=pybind11

https://ci.debian.net/data/autopkgtest/testing/amd64/d/dolfin/1036561/log.gz

=================================== FAILURES ===================================
________________________ test_compile_extension_module _________________________

    @skip_if_not_PETSc
    def test_compile_extension_module():
    
        # This test should do basically the same as the docstring of the
        # compile_extension_module function in compilemodule.py. Remember
        # to update the docstring if the test is modified!
    
        from numpy import arange, exp
        code = """
          #include <pybind11/pybind11.h>
    
          #include <petscvec.h>
          #include <dolfin/la/PETScVector.h>
    
          void PETSc_exp(std::shared_ptr<dolfin::PETScVector> vec)
          {
            Vec x = vec->vec();
            assert(x);
            VecExp(x);
          }
    
        PYBIND11_MODULE(SIGNATURE, m)
        {
          m.def("PETSc_exp", &PETSc_exp);
        }
        """
    
        ext_module = compile_cpp_code(code)
    
        vec = PETScVector(MPI.comm_world, 10)
        np_vec = vec.get_local()
        np_vec[:] = arange(len(np_vec))
        vec.set_local(np_vec)
>       ext_module.PETSc_exp(vec)
E       TypeError: PETSc_exp(): incompatible function arguments. The following argument types are supported:
E           1. (arg0: dolfin::PETScVector) -> None
E       
E       Invoked with: <dolfin.cpp.la.PETScVector object at 0x7fe71f9b0468>

python/test/unit/jit/test_jit.py:221: TypeError
__________________________ test_creation_and_marking ___________________________

    def test_creation_and_marking():
    
        class Left(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] < DOLFIN_EPS
    
        class LeftOnBoundary(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] < DOLFIN_EPS and on_boundary
    
        class Right(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] > 1.0 - DOLFIN_EPS
    
        class RightOnBoundary(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] > 1.0 - DOLFIN_EPS and on_boundary
    
        cpp_code = """
            #include<pybind11/pybind11.h>
            #include<pybind11/eigen.h>
            namespace py = pybind11;
    
            #include<Eigen/Dense>
            #include<dolfin/mesh/SubDomain.h>
    
            class Left : public dolfin::SubDomain
            {
            public:
    
              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] < DOLFIN_EPS;
              }
            };
    
            class LeftOnBoundary : public dolfin::SubDomain
            {
            public:
    
              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] < DOLFIN_EPS and on_boundary;
              }
            };
    
            class Right : public dolfin::SubDomain
            {
            public:
    
              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] > 1.0 - DOLFIN_EPS;
              }
            };
    
            class RightOnBoundary : public dolfin::SubDomain
            {
            public:
    
              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] > 1.0 - DOLFIN_EPS and on_boundary;
              }
            };
    
        PYBIND11_MODULE(SIGNATURE, m) {
           py::class_<Left, std::shared_ptr<Left>, dolfin::SubDomain>(m, "Left").def(py::init<>());
           py::class_<Right, std::shared_ptr<Right>, dolfin::SubDomain>(m, "Right").def(py::init<>());
           py::class_<LeftOnBoundary, std::shared_ptr<LeftOnBoundary>, dolfin::SubDomain>(m, "LeftOnBoundary").def(py::init<>());
           py::class_<RightOnBoundary, std::shared_ptr<RightOnBoundary>, dolfin::SubDomain>(m, "RightOnBoundary").def(py::init<>());
        }
        """
    
>       compiled_domain_module = compile_cpp_code(cpp_code)

python/test/unit/mesh/test_sub_domain.py:127:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/dist-packages/dolfin/jit/pybind11jit.py:87: in compile_cpp_code
    generate=jit_generate)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('\n        #include<pybind11/pybind11.h>\n        #include<pybind11/eigen.h>\n        namespace py = pybind11;\n\n ...ache': {'cache_dir': None, 'comm_dir': 'comm', 'enable_build_log': True, 'fail_dir_root': None, ...}, 'generator': {}})
kwargs = {'generate': <function jit_generate at 0x7f489106f488>}
mpi_comm = <mpi4py.MPI.Intracomm object at 0x7f48898e0e50>, status = 1
root = True
error_msg = 'generic_type: type "Left" referenced unknown base type "dolfin::SubDomain"'
global_status = 1.0

    @wraps(local_jit)
    def mpi_jit(*args, **kwargs):
    
        # FIXME: should require mpi_comm to be explicit
        # and not default to comm_world?
        mpi_comm = kwargs.pop("mpi_comm", MPI.comm_world)
    
        # Just call JIT compiler when running in serial
        if MPI.size(mpi_comm) == 1:
            return local_jit(*args, **kwargs)
    
        # Default status (0 == ok, 1 == fail)
        status = 0
    
        # Compile first on process 0
        root = MPI.rank(mpi_comm) == 0
        if root:
            try:
                output = local_jit(*args, **kwargs)
            except Exception as e:
                status = 1
                error_msg = str(e)
    
        # TODO: This would have lower overhead if using the dijitso.jit
        # features to inject a waiting callback instead of waiting out here.
        # That approach allows all processes to first look in the cache,
        # introducing a barrier only on cache miss.
        # There's also a sketch in dijitso of how to make only one
        # process per physical cache directory do the compilation.
    
        # Wait for the compiling process to finish and get status
        # TODO: Would be better to broadcast the status from root but this works.
        global_status = MPI.max(mpi_comm, status)
    
        if global_status == 0:
            # Success, call jit on all other processes
            # (this should just read the cache)
            if not root:
                output = local_jit(*args, **kwargs)
        else:
            # Fail simultaneously on all processes,
            # to allow catching the error without deadlock
            if not root:
                error_msg = "Compilation failed on root node."
>           raise RuntimeError(error_msg)
E           RuntimeError: generic_type: type "Left" referenced unknown base type "dolfin::SubDomain"

/usr/lib/python3/dist-packages/dolfin/jit/jit.py:82: RuntimeError
=============================== warnings summary ===============================
test/unit/jit/test_jit.py::test_nasty_jit_caching_bug
  /usr/lib/python3/dist-packages/ffc/jitcompiler.py:234: QuadratureRepresentationDeprecationWarning:
  *** ===================================================== ***
  *** FFC: quadrature representation is deprecated! It will ***
  *** likely be removed in 2018.2.0 release. Use uflacs     ***
  *** representation instead.                               ***
  *** ===================================================== ***
    issue_deprecation_warning()
  /usr/lib/python3/dist-packages/ffc/jitcompiler.py:234: QuadratureRepresentationDeprecationWarning:
  *** ===================================================== ***
  *** FFC: quadrature representation is deprecated! It will ***
  *** likely be removed in 2018.2.0 release. Use uflacs     ***
  *** representation instead.                               ***
  *** ===================================================== ***
    issue_deprecation_warning()

-- Docs: http://doc.pytest.org/en/latest/warnings.html
= 2 failed, 796 passed, 437 skipped, 35 xfailed, 2 warnings in 672.01 seconds ==
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [[60892,1],1]
  Exit code:    1
--------------------------------------------------------------------------
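As an aside on the mpi_jit code shown in the traceback: the pattern is
"compile on rank 0, then let every rank agree on max(status), so a failure
raises on all ranks at once instead of deadlocking the non-root ranks". A
minimal single-process sketch of that coordination logic, with the MPI
communicator replaced by a plain loop over simulated ranks (mpi_jit_pattern
and its arguments are hypothetical names, not part of dolfin):

```python
def mpi_jit_pattern(ranks, compile_on_root):
    """Simulate dolfin's mpi_jit coordination across `ranks` processes.

    compile_on_root: callable executed only on rank 0; raising means the
    JIT compilation failed. Returns a per-rank list: "ok" on success, or
    the message each rank would put into its RuntimeError on failure.
    """
    statuses = {}
    root_error = None
    for rank in range(ranks):
        if rank == 0:
            # Rank 0 does the actual compilation (local_jit in dolfin).
            try:
                compile_on_root()
                statuses[rank] = 0
            except Exception as e:
                statuses[rank] = 1
                root_error = str(e)
        else:
            # Other ranks do nothing yet; they will read the cache later.
            statuses[rank] = 0

    # Stand-in for MPI.max(mpi_comm, status): every rank learns the
    # worst status, so they all take the same branch afterwards.
    global_status = max(statuses.values())

    results = []
    for rank in range(ranks):
        if global_status == 0:
            results.append("ok")
        elif rank == 0:
            # Root knows the real compiler error.
            results.append(root_error)
        else:
            # Non-root ranks only know that root failed.
            results.append("Compilation failed on root node.")
    return results
```

This makes visible why the log shows two different RuntimeError texts for
the same failure: rank 0 reports the real error (the generic_type message),
while the other ranks can only report the generic "Compilation failed on
root node." placeholder.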

Attachment: signature.asc
Description: OpenPGP digital signature


--- End Message ---
--- Begin Message ---
Source: dolfin
Source-Version: 2018.1.0.post1-12

We believe that the bug you reported is fixed in the latest version of
dolfin, which is due to be installed in the Debian FTP archive.

A summary of the changes between this version and the previous one is
attached.

Thank you for reporting the bug, which will now be closed.  If you
have further comments please address them to [email protected],
and the maintainer will reopen the bug report if appropriate.

Debian distribution maintenance software
pp.
Drew Parsons <[email protected]> (supplier of updated dolfin package)

(This message was generated automatically at their request; if you
believe that there is a problem with it please contact the archive
administrators by mailing [email protected])


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

Format: 1.8
Date: Mon, 24 Sep 2018 03:03:27 +0800
Source: dolfin
Binary: libdolfin-dev libdolfin2018.1 python3-dolfin python-dolfin dolfin-doc 
dolfin-bin
Architecture: source
Version: 2018.1.0.post1-12
Distribution: unstable
Urgency: medium
Maintainer: Debian Science Team 
<[email protected]>
Changed-By: Drew Parsons <[email protected]>
Description:
 dolfin-bin - Executable scripts for DOLFIN
 dolfin-doc - Documentation and demo programs for DOLFIN
 libdolfin-dev - Shared links and header files for DOLFIN
 libdolfin2018.1 - Shared libraries for DOLFIN
 python-dolfin - Python interface for DOLFIN
 python3-dolfin - Python interface for DOLFIN (Python 3)
Closes: 909407
Changes:
 dolfin (2018.1.0.post1-12) unstable; urgency=medium
 .
   * apply strict versioned depends on pybind11,
     python3-dolfin Depends: pybind11 (>= 2.2.4), pybind11 (<= 2.2.5)
     Closes: #909407.
   * Hardening: add dpkg-buildflags CXXFLAGS to CMAKE_CXX_FLAGS and
     python module build
   * use VERBOSE build of python modules
Checksums-Sha1:
 3d8f969bdd9fc9bbd9bdf1ac3e972317a2f9985e 3381 dolfin_2018.1.0.post1-12.dsc
 1e397b624f3fe3d596941334782f689537f39901 24296 
dolfin_2018.1.0.post1-12.debian.tar.xz
Checksums-Sha256:
 5984e41d8bafdd953c37168350bcb6b4075809fe364c9a0ba61a79ca2581261e 3381 
dolfin_2018.1.0.post1-12.dsc
 48eec165a0e8df350b78f1a20db48c5605d5262673ac5c3a8d23441f972ec1c7 24296 
dolfin_2018.1.0.post1-12.debian.tar.xz
Files:
 3463bf95f5114251f312b0c0596b8479 3381 math optional 
dolfin_2018.1.0.post1-12.dsc
 fdf73de138ca53033bd552220af5c123 24296 math optional 
dolfin_2018.1.0.post1-12.debian.tar.xz

-----BEGIN PGP SIGNATURE-----

iQIzBAEBCAAdFiEEI8mpPlhYGekSbQo2Vz7x5L1aAfoFAlun7TEACgkQVz7x5L1a
AfpzMQ//U1SZ3RlkT0E0PVrlgWkK/jGRcYpWgo9MCrZleciKPk4nmNboe3wMLqpS
mF75jgRbL8sszadsjaS2+MqqbMPJt8nUUL5S6hch+FGtwaI8pp62fIoQpyBC4bAF
SXjNnaLeMP0/4FPHKC8iuFCAxwhURU/B3TQSeSFrCy5vi2V1mxAYa1najzlTwQir
9bad2AAIxecY6H4c28PrRNJICUj5c/KrnDVkxmJgVLNOaTzdHigKhdG9z9mKxq6t
DfwX02vZBNZKTQTTLtTdtDvmqVGnAl8n5U2x93nhfkVAw0sKM2AIo1NhHF6i/yyD
n5FU/6AYjiYP53zYCNT7/n3AKiRNPN3+/snotKNA0ymcie8uyeyaGVSHxKCQ3hVA
m6qcxpoM/8i9Zd4OE7CKpL+dH+FjpTeF/idX4gAkDvKls7l3gF/jRou0M4L+aJJ2
AVAaPCy58qIbRkvhsJbz3ts0UOYoFKp5BMPFjnzAUahrR5SvPPw7VdUDmocKRp6Z
XxpbxbKOTJDiY7xPP4oPQ9aOZwceog+RiS2bYdg/dLKmEgsjJD5BC9UNiuSiADEb
UGZdVos+QStBh/HJGKzyrx673tdwkBsTyuDJIrCFF0RhFCwk5zSYKzKLrj7sHH0e
o6C05S1xRM5IQQssWGCqrvkZ4UaukIulgxLgtGa0YotsNevJMZE=
=YHXj
-----END PGP SIGNATURE-----

--- End Message ---