Re: autopkgtest and conda meta.yaml files
> Do you think there is value in those that we don't have in
> autopkgtest-pkg-pybuild? In my experience conda test commands are just
> ad-hoc commands to run e.g. pytest, but my experience with them is narrow.

I do not have a lot of experience with them yet either, but the test suite is most of the time maintained by upstream, so running their own test suite seems valuable to me.

Fred
autopkgtest and conda meta.yaml files
Hello,

do you know if there is a tool which automatically produces autopkgtest snippets from a conda meta.yaml file? Since plenty of upstreams write these kinds of scripts, it would be great to execute these tests via autopkgtest when possible. I have a student who is writing a tool which extracts the tests from the meta.yaml and tries to produce a script which executes them. Do you think this would be of some interest for Debian, and how could it be integrated into the packaging process?

thanks

Frederic
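For illustration, here is roughly what such a translation could look like: a conda recipe's test section on one side, and the DEP-8 stanza it could be mapped to on the other. All package names here are made up, and the mapping of conda dependency names to Debian package names is the hard part such a tool would have to solve:

```yaml
# meta.yaml (conda recipe, hypothetical package "foo")
test:
  requires:
    - pytest
  commands:
    - pytest --pyargs foo
```

```
# debian/tests/control (what the generated stanza might look like)
Test-Command: python3 -m pytest --pyargs foo
Depends: python3-foo, python3-pytest
Restrictions: allow-stderr
```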
Re: Policy Change Proposal: Running the upstream test suite
> Those will correspond to build dependencies in a typical simple case, as
> the upstream tests need all those optional deps, assuming you mean the
> tests dependencies (and not the runtime dependencies which should already
> be in Depends).

Yes, the runtime dependencies should be listed in the Depends of the python3- package.

>> Maybe we should install only the python binaries and the dependencies marked.
>
> In a typical simple case all B-Ds except sphinx stuff will be, as
> you don't need anything beyond the build system to "build" a pure
> Python module. So this is exactly what you wanted to avoid.

Yes, I want to be sure that all the runtime dependencies are rightfully declared in the Depends of the Python module package. Sometimes upstream forgets about dependencies, or marks them as optional when they are not, when running the tests... Is it possible to achieve this automatically? lintian? dh-python? python3-stdeb?

Fred.

PS: In Haskell we have a cabal-debian package which automatically computes the list of dependencies from the cabal file, then updates the control file to reflect the new status of the package.
Re: Policy Change Proposal: Running the upstream test suite
> Maybe we indeed want a "minimal" autopkgtest environment, but many
> upstream tests will fail in those and I don't see an automatic way to test
> a random package in this way.

Even if it is not minimal, it should at least correspond to the dependencies upstream declares. By 'declares' I am not even sure of the meaning: the dependencies of the tests can be different from the ones required at install time. Maybe we should install only the Python binaries and the dependencies marked. Is there a standard in the Python community for the test dependencies?
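As far as I know there is no single mandated standard, but the common modern convention is an extra (often named `test` or `dev`) declared in pyproject.toml; a typical sketch, with hypothetical contents:

```toml
# pyproject.toml (sketch)
[project]
name = "foo"
dependencies = ["numpy"]

[project.optional-dependencies]
test = [
    "pytest",
    "pytest-cov",
]
```

Installing with `pip install foo[test]` then pulls in the test dependencies on top of the runtime ones, which is the distinction being discussed here.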
Re: Policy Change Proposal: Running the upstream test suite
My use case is to check that all the dependencies computed by dh_python3 from the build tools are indeed listed in the Depends of the binary package. I am thinking about packages which provide optional dependencies via extras. In that case the extra should be declared when calling dh_python3. So I must install only the python3- package and the test framework, to be sure that it is functional as installed. It seems to me that the autopkgtest-pybuild plugin is not well suited for this purpose (as it is now).

Cheers

Frederic

For piuparts, I also have a test that should be performed with one of my packages: the upgrade of a database. I should install the package, upgrade it, and check that the new version has the database properly upgraded. Is it possible to test upgrade scenarios with autopkgtest?
Re: Policy Change Proposal: Running the upstream test suite
I have one concern when using the pybuild autopkgtest. It installs the build dependencies by default, which is not what people have when they install the packages. It should be possible to define the autopkgtest dependencies explicitly. This way we could catch missing dependencies in the python3- package's Depends. Is it possible?

Cheers

Frederic
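For reference, a hand-written DEP-8 test can pin exactly the dependencies to be installed, instead of pulling in the build dependencies; a sketch for a hypothetical python3-foo package:

```
# debian/tests/control (sketch; package names are hypothetical)
Test-Command: cd "$AUTOPKGTEST_TMP" && python3 -c "import foo"
Depends: python3-foo
Restrictions: allow-stderr
```

Because only python3-foo (and its declared Depends) is installed, an import that relied on an undeclared dependency present at build time would fail here, which is exactly the missing-Depends situation described above.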
Re: install entry points in a dedicated binary package
> I'm not 100% sure I understand your question, but is something
> preventing you from installing the script with a
> debian/binoculars.install file?

Nothing, but it seems to me (I may be wrong) that pybuild installs all files directly in the python3- package. Am I wrong? I end up with the scripts in the python3- package. Is pybuild clever enough to see that the scripts should go into another package and not the default python3- one?

Cheers
install entry points in a dedicated binary package
Hello,

I am modernizing the binoculars package. I switched it to pyproject.toml and now I need to update the packaging. I would like your advice in order to replace this d/rules

---
export DH_VERBOSE=1
export PYBUILD_NAME=binoculars
export PYBUILD_AFTER_INSTALL=rm -rf {destdir}/usr/bin/

%:
	dh $@ --with python3,sphinxdoc --buildsystem=pybuild

override_dh_auto_test:
	dh_auto_test -- --system=custom --test-args='{interpreter} -m unittest discover -s tests -t {dir} -v'

override_dh_install:
	dh_numpy3
	dh_install
	# install scripts into binoculars
	python3 setup.py install_scripts -d debian/binoculars/usr/bin

override_dh_sphinxdoc:
ifeq (,$(findstring nodocs, $(DEB_BUILD_OPTIONS)))
	PYTHONPATH=. http_proxy='127.0.0.1:9' sphinx-build -N -bhtml doc/source build/html # HTML generator
	dh_installdocs -p binoculars-doc "build/html"
	dh_sphinxdoc -O--buildsystem=pybuild
endif
---

with something functionally equivalent which installs the entry points only in the binoculars package, i.e. replace this combination

export PYBUILD_AFTER_INSTALL=rm -rf {destdir}/usr/bin/
...
# install scripts into binoculars
python3 setup.py install_scripts -d debian/binoculars/usr/bin

with ???

thanks for your help

Frederic
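One common approach (a sketch, to be checked against this particular package) is to let pybuild install everything into debian/tmp, which it does by default when the source builds more than one binary package, and then split the result with install files instead of removing and re-creating the scripts:

```
# debian/binoculars.install (sketch)
usr/bin

# debian/python3-binoculars.install (sketch; adjust to the actual
# binary package name carrying the module)
usr/lib/python3
```

With this split, both PYBUILD_AFTER_INSTALL and the setup.py install_scripts call can be dropped, since the entry points generated by the build land in usr/bin and are routed to the binoculars package by dh_install.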
Re: [Debian-pan-maintainers] [Help] Re: python-future: FTBFS: dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.11 returned exit code 13
ok for me

----- On 4 Jan 24, at 13:19, Alexandre Detiste alexandre.deti...@gmail.com wrote:

> On Thu, 4 Jan 2024 at 07:48, Andreas Tille wrote:
>> > @Vincent: this one package "gtextfsm" is yours
>> > do you green light an upload ?
>>
>> If you ask me the package is team maintained and a "Team upload"
>> should be fine.
>
> Hi, I just try to follow the rules I agreed on last month.
>
> https://salsa.debian.org/python-team/tools/python-modules/blob/master/policy.rst#id2
>
> | Team in Uploaders is a weak statement of collaboration. Help in
>   maintaining the package is appreciated,
> | commits to the Git repository are freely welcomed, but before
>   uploading, please contact the Maintainer for the green light.
>
> There are not so many packages where "Uploader = DPT" to begin with,
> so this might not be a well-known practice...
>
> So I'm formally asking Ana & PaN for approval to upload "lexicon" and
> "dioptas".
> (lexicon is a one line change, dioptas needs to package a new release)
>
> @Vincent: thanks.
>
> Greetings
>
> Debian Python Team
>   dioptas (U)
>   gtextfsm (U)
>   lexicon (U)
>
> Ana Custura
>   lexicon
>
> Debian PaN Maintainers
>   dioptas
>
> --
> Debian-pan-maintainers mailing list
> debian-pan-maintain...@alioth-lists.debian.net
> https://alioth-lists.debian.net/cgi-bin/mailman/listinfo/debian-pan-maintainers
bug in DebHelper or pybuild when detecting the build system plugin
Hello,

I am working on solving the bitshuffle FTBFS. I took the liberty to modernize the package in my NMU. But once I added pybuild-plugin-pyproject, the build failed with this error message:

dh_auto_clean -- -v
	pybuild --clean -i python{version} -p "3.12 3.11" -v
D: pybuild pybuild:614: version: 6.20231204
D: pybuild pybuild:615: ['/usr/bin/pybuild', '--clean', '-i', 'python{version}', '-p', '3.12 3.11', '-v']
D: pybuild pybuild:39: cfg: Namespace(verbose=True, quiet=False, really_quiet=False, detect_only=False, clean_only=True, configure_only=False, build_only=False, install_only=False, test_only=False, autopkgtest_only=False, list_systems=False, print_args=None, before_clean=None, clean_args=None, after_clean=None, before_configure=None, configure_args=None, after_configure=None, before_build=None, build_args=None, after_build=None, before_install=None, install_args=None, after_install=None, before_test=None, test_args=None, after_test=None, test_nose=False, test_nose2=False, test_pytest=False, test_tox=False, test_custom=False, dir='/home/picca/debian/picca/bitshuffle', destdir='debian/tmp', ext_destdir=None, ext_pattern='\\.so(\\.[^/]*)?$', ext_sub_pattern=None, ext_sub_repl=None, install_dir=None, name='bitshuffle', system=None, versions=['3.12', '3.11'], interpreter=['python{version}'], disable=None, custom_tests=False)
D: pybuild tools:232: invoking: /usr/bin/dpkg-architecture
D: pybuild debhelper:183: source=bitshuffle, binary packages=['bitshuffle']
E: pybuild pybuild:122: unrecognized build system: pyproject
dh_auto_clean: error: pybuild --clean -i python{version} -p "3.12 3.11" -v returned exit code 10

So I instrumented the pybuild code a bit with some prints:

    # Selected by build_dep?
    if not selected_plugin:
        dh = DebHelper(build_options())
        for build_dep in dh.build_depends:
            if build_dep.startswith('pybuild-plugin-'):
                print(f"'{build_dep}'")
                print()
                selected_plugin = build_dep.split('-', 2)[2]
                break

    if selected_plugin:
        certainty = 99
        print()
        print(f'"{selected_plugin}"')
        print()
        print(build.plugins)
        print()
        Plugin = build.plugins.get(selected_plugin)
        print(Plugin)
        print()
        if not Plugin:
            log.error('unrecognized build system: %s', selected_plugin)
            exit(10)
        plugin = Plugin(cfg)
        context = {'ENV': env, 'args': {}, 'dir': cfg.dir}
        plugin.detect(context)

And I discovered this:

---
   dh clean --buildsystem=pybuild
   debian/rules override_dh_auto_clean
make[1]: Entering directory '/home/picca/debian/picca/bitshuffle'
rm -rf lzf/lzf
rm -rf lz4
dh_auto_clean -- -v
	pybuild --clean -i python{version} -p "3.12 3.11" -v
D: pybuild pybuild:614: version: 6.20231204
D: pybuild pybuild:615: ['/usr/bin/pybuild', '--clean', '-i', 'python{version}', '-p', '3.12 3.11', '-v']
D: pybuild pybuild:39: cfg: Namespace(verbose=True, quiet=False, really_quiet=False, detect_only=False, clean_only=True, configure_only=False, build_only=False, install_only=False, test_only=False, autopkgtest_only=False, list_systems=False, print_args=None, before_clean=None, clean_args=None, after_clean=None, before_configure=None, configure_args=None, after_configure=None, before_build=None, build_args=None, after_build=None, before_install=None, install_args=None, after_install=None, before_test=None, test_args=None, after_test=None, test_nose=False, test_nose2=False, test_pytest=False, test_tox=False, test_custom=False, dir='/home/picca/debian/picca/bitshuffle', destdir='debian/tmp', ext_destdir=None, ext_pattern='\\.so(\\.[^/]*)?$', ext_sub_pattern=None, ext_sub_repl=None, install_dir=None, name='bitshuffle', system=None, versions=['3.12', '3.11'], interpreter=['python{version}'], disable=None, custom_tests=False)
D: pybuild tools:232: invoking: /usr/bin/dpkg-architecture
D: pybuild debhelper:183: source=bitshuffle, binary packages=['bitshuffle']
'pybuild-plugin-pyproject '

"pyproject "

{'autopkgtest': , 'cmake': , 'custom': , 'distutils': , 'flit': , 'meson': , 'pyproject': }

None

E: pybuild pybuild:122: unrecognized build system: pyproject
---

So the detected plugin is 'pyproject ' with trailing whitespace, and this comes straight from the DebHelper build_dep parsing: 'pybuild-plugin-pyproject '. This is due to the particular layout of the maintainer's Build-Depends:

Build-Depends: debhelper-compat (= 13)
             , dh-exec
             , dh-python
             , dh-sequence-numpy3
             , dh-sequence-python3
             , pybuild-plugin-pyproject
             , python3-all-dev
             , python3-num
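The failure can be reproduced without pybuild at all: a build-dependency name that keeps surrounding whitespace after parsing makes the dict lookup in the plugin registry miss. A minimal sketch (the registry and its values are made up, and this is a simplification of pybuild's actual code, not a copy of it):

```python
# Simplified reproduction of the plugin lookup failure: a Build-Depends
# field written in leading-comma style can yield a parsed name with
# trailing whitespace, and the registry lookup then returns None.
plugins = {'pyproject': 'PyProjectPlugin', 'distutils': 'DistutilsPlugin'}

build_dep = 'pybuild-plugin-pyproject '           # note the trailing space

selected = build_dep.split('-', 2)[2]             # -> 'pyproject '
assert plugins.get(selected) is None              # lookup fails -> exit code 10

# Stripping the name before the lookup fixes the detection.
fixed = build_dep.strip().split('-', 2)[2]        # -> 'pyproject'
assert plugins.get(fixed) == 'PyProjectPlugin'    # lookup succeeds
```

So the fix could live either in the Build-Depends parsing (strip each entry) or in the plugin selection (strip before splitting); the whitespace-laden field itself is valid per Debian control-file syntax.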
pybuild and optional dependencies
Hello,

I am updating the xraylarch package, which contains something like this in its setup.cfg:

```
install_requires =
    asteval>=0.9.28
    numpy>=1.20
    scipy>=1.7
    uncertainties>=3.1.4
    lmfit>=1.2.1
    pyshortcuts>=1.9.0
    xraydb>=4.5
    silx>=0.15.2
    matplotlib>=3.5
    sqlalchemy>=2.0
    sqlalchemy_utils
    h5py>=3.2
    hdf5plugin
    pillow>=8.3.2
    numdifftools
    pandas
    packaging
    pip
    pyyaml
    toml
    termcolor
    dill
    imageio
    charset-normalizer
    peakutils
    requests
    scikit-image
    scikit-learn
    psutil
    pymatgen
    mp_api
    pycifrw
    fabio
    pyfai
    numexpr==2.8.4; python_version < '3.9'
    numexpr>=2.8.7; python_version > '3.8'

[options.extras_require]
wxgui =
    wxpython>=4.1
    wxutils>=0.3.0
    wxmplot>=0.9.57
qtgui =
    PyQt5
    pyqtwebengine
    pyqtgraph
epics =
    pyepics>=3.5.0
    psycopg2-binary
jupyter =
    jupyter_core>=5.0
    jupyter_client
    jupyter_server
    notebook
    nbformat
    ipywidgets
    plotly
    py3dmol
doc =
    sphinx
    numpydoc
    sphinxcontrib-bibtex
    sphinxcontrib-argdoc
    pycairo; platform_system=="Windows"
dev =
    pytest
    pytest-cov
    coverage
    build
    pre-commit
    twine
larix =
    %(wxgui)s
    %(jupyter)s
all =
    %(dev)s
    %(doc)s
    %(wxgui)s
    %(qtgui)s
    %(jupyter)s
    %(epics)s
```

When I build the package, I get the runtime dependencies computed by dh_python3 from install_requires. Now I would like to build this package with the larix optional dependencies, so I added all of them to the Build-Depends, but dh_python3 still produces the previous dependencies. How can I teach pybuild that I really want xraylarch[larix]?

thanks

Frederic
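Judging from the option names visible in dh_python3's debug output (depends_section, recommends_section), dh_python3 can map an extras section of the installed requirements metadata onto a package relationship field. A sketch, assuming the larix extra ends up as a [larix] section in the egg-info requires.txt (to be verified against the dh_python3 manual page):

```make
# debian/rules (sketch): translate the 'larix' extras section into Depends
override_dh_python3:
	dh_python3 --depends-section=larix
```

Adding the extra's packages to Build-Depends alone cannot work, since dh_python3 derives ${python3:Depends} from the upstream metadata, not from what happens to be installed at build time.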
[long] entry points and packages which do not provide egg-info or dist-info
Hello, I am the maintainer of silx. I have this problem with the GUI application:

$ silx view
Traceback (most recent call last):
  File "/usr/bin/silx", line 33, in <module>
    sys.exit(load_entry_point('silx==1.1.2', 'console_scripts', 'silx')())
  File "/usr/lib/python3/dist-packages/silx/__main__.py", line 67, in main
    status = launcher.execute(sys.argv)
  File "/usr/lib/python3/dist-packages/silx/utils/launcher.py", line 294, in execute
    return command.execute(command_argv)
  File "/usr/lib/python3/dist-packages/silx/utils/launcher.py", line 128, in execute
    status = func(argv)
  File "/usr/lib/python3/dist-packages/silx/app/view/main.py", line 214, in main
    mainQt(options)
  File "/usr/lib/python3/dist-packages/silx/app/view/main.py", line 156, in mainQt
    import silx.gui.utils.matplotlib  # noqa
  File "/usr/lib/python3/dist-packages/silx/gui/utils/matplotlib.py", line 39, in <module>
    from pkg_resources import parse_version
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 3328, in <module>
    @_call_aside
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 3303, in _call_aside
    f(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 3341, in _initialize_master_working_set
    working_set = WorkingSet._build_master()
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 631, in _build_master
    ws.require(__requires__)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 969, in require
    needed = self.resolve(parse_requirements(requirements))
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 829, in resolve
    dist = self._resolve_dist(
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 871, in _resolve_dist
    raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'h5py' distribution was not found and is required by 'hdf5plugin silx'

So the entry point does not find the h5py distribution, even though python3-h5py is installed via python3-h5py-serial.
I reported the issue to the h5py maintainer via this bug report:

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1051781

This package is a bit special since it provides a serial implementation, python3-h5py-serial, and an MPI one via python3-h5py-mpi. Both packages are co-installable. In each of these packages there is a dist-info directory:

/usr/lib/python3/dist-packages/h5py._debian_h5py_serial-3.7.0.dist-info/METADATA
/usr/lib/python3/dist-packages/h5py._debian_h5py_serial-3.7.0.dist-info/RECORD
/usr/lib/python3/dist-packages/h5py._debian_h5py_serial-3.7.0.dist-info/WHEEL
/usr/lib/python3/dist-packages/h5py._debian_h5py_serial-3.7.0.dist-info/top_level.txt

and

/usr/lib/python3/dist-packages/h5py._debian_h5py_mpi-3.7.0.dist-info/METADATA
/usr/lib/python3/dist-packages/h5py._debian_h5py_mpi-3.7.0.dist-info/RECORD
/usr/lib/python3/dist-packages/h5py._debian_h5py_mpi-3.7.0.dist-info/WHEEL
/usr/lib/python3/dist-packages/h5py._debian_h5py_mpi-3.7.0.dist-info/top_level.txt

On Debian the two versions are named h5py._debian_h5py_serial and h5py._debian_h5py_mpi. This organisation is specific to Debian, since upstream provides only one version of the h5py package; on Debian we are able to co-install an MPI and a serial version of this library, linked to the MPI or serial version of the hdf5 library.

The differences are:

$ diff /usr/lib/python3/dist-packages/h5py._debian_h5py_serial-3.7.0.dist-info/METADATA /usr/lib/python3/dist-packages/h5py._debian_h5py_mpi-3.7.0.dist-info/METADATA
2c2
< Name: h5py.-debian-h5py-serial
---
> Name: h5py.-debian-h5py-mpi
30a31
> Requires-Dist: mpi4py (>=3.0.2)

So: the name, and a dependency on the mpi4py package. For the RECORD files, the installed files are located in two different directories, so they are totally different. top_level.txt and WHEEL are identical.

There is a generic python3-h5py package which is used in the B-D of packages depending on h5py. It is almost empty, but depends on one of the two implementations:

$ dpkg -L python3-h5py
/.
/usr
/usr/share
/usr/share/doc
/usr/share/doc/python3-h5py
/usr/share/doc/python3-h5py/README.Debian
/usr/share/doc/python3-h5py/README.rst
/usr/share/doc/python3-h5py/changelog.Debian.gz
/usr/share/doc/python3-h5py/copyright

$ apt show python3-h5py
Package: python3-h5py
Version: 3.7.0-8
Priority: optional
Section: python
Source: h5py
Maintainer: Debian Science Maintainers
Installed-Size: 22,5 kB
Depends: python3-h5py-serial | python3-h5py-mpi

This package does not provide a dist-info named h5py, so all entry points which depend on h5py fail as shown above.
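The failure mode can be demonstrated with importlib.metadata, which reads the same dist-info metadata store as pkg_resources: a module can be perfectly importable while no installed *distribution* carries the matching name, because the lookup goes by dist-info name, not by what `import` finds on sys.path. A minimal sketch with a deliberately non-existent distribution name:

```python
import importlib.metadata

# Distribution lookup goes through the installed *.dist-info/*.egg-info
# directories. A renamed dist-info (h5py._debian_h5py_serial instead of
# h5py) can therefore satisfy "import h5py" while a metadata lookup for
# the name "h5py" still fails, exactly as in the silx traceback.
try:
    importlib.metadata.distribution("h5py-dist-info-name-that-does-not-exist")
    found = True
except importlib.metadata.PackageNotFoundError:
    found = False

assert found is False
```

This is why the silx entry point aborts even though python3-h5py-serial is installed: pkg_resources resolves the `h5py` *requirement* against dist-info names, and no dist-info named h5py exists.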
Re: python2:any dependency
The shebang distributed by upstream did not change (#!python), but previously it was replaced by a python3 shebang. Here is the log for the previous version:

D: dh_python3 dh_python3:179: version: 5.20230109
D: dh_python3 dh_python3:180: argv: ['/usr/bin/dh_python3', '-i', '-O--buildsystem=pybuild']
D: dh_python3 dh_python3:181: options: Namespace(guess_deps=True, skip_private=False, verbose=True, arch=False, package=None, no_package=None, remaining_packages=False, compile_all=False, vrange=None, regexpr=None, accept_upstream_versions=False, depends=None, depends_section=None, recommends=None, recommends_section=None, suggests=None, suggests_section=None, requires=None, shebang=None, ignore_shebangs=False, clean_dbg_pkg=True, no_ext_rename=False, no_shebang_rewrite=False, private_dir=None, O=['--buildsystem=pybuild'])
D: dh_python3 dh_python3:182: supported Python versions: 3.10,3.11 (default=3.11)
D: dh_python3 debhelper:175: skipping package pymca-data (missing ${python3:Depends} in Depends/Recommends)
D: dh_python3 debhelper:175: skipping package pymca-doc (missing ${python3:Depends} in Depends/Recommends)
D: dh_python3 debhelper:183: source=pymca, binary packages=['pymca']
D: dh_python3 dh_python3:204: processing package pymca...
D: dh_python3 fs:400: package pymca details = {'requires.txt': set(), 'egg-info': set(), 'dist-info': set(), 'nsp.txt': set(), 'shebangs': {/usr/bin/python3, /usr/bin/python3, /usr/bin/python3, /usr/bin/python3, /usr/bin/python3, /usr/bin/python3, /usr/bin/python3, /usr/bin/python3, /usr/bin/python3}, 'public_vers': set(), 'private_dirs': {}, 'compile': False, 'ext_vers': set(), 'ext_no_version': set()}
D: dh_python3 depends:117: generating dependencies for package pymca
D: dh_python3 depends:281: D={'python3:any'}; R=[]; S=[]; E=[], B=[]; RT=[]
dh_installsystemduser -i -O--buildsystem=pybuild

Indeed the shebang was identical:

https://sources.debian.org/src/pymca/5.8.0%2Bdfsg-2/PyMca5/scripts/edfviewer/

#!python

So the behaviour of dh_python3 changed. Now, if I want to solve this, what is the best solution:

- a patch which replaces all python shebangs with python3
- dh_python3 --shebang=/usr/bin/python3

thanks for your advice.

Frederic
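The first option (a patch, or a small helper run from d/rules) amounts to rewriting the first line of each script. A sketch of that rewrite; the scripts directory in the usage comment is only an example path:

```python
from pathlib import Path

def fix_shebang(path: Path) -> bool:
    """Replace a bare '#!python' shebang with '#!/usr/bin/python3'.

    Returns True if the file was modified, False if it was left alone.
    """
    text = path.read_text()
    old = "#!python\n"
    if text.startswith(old):
        path.write_text("#!/usr/bin/python3\n" + text[len(old):])
        return True
    return False

# Example usage (hypothetical directory):
# for script in Path("PyMca5/scripts").glob("*"):
#     fix_shebang(script)
```

The second option keeps the change in the packaging rather than in a patch, which is arguably easier to carry across upstream releases since the upstream files stay untouched.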
python2:any dependency
Hello,

I updated pymca with the new upstream version, and now the pymca package depends on the missing python2:any package (which is not available in Debian). This dependency was generated by dh_python3 during the build:

dh_python3 -i -O--buildsystem=pybuild
D: dh_python3 dh_python3:179: version: 5.20230603
D: dh_python3 dh_python3:180: argv: ['/usr/bin/dh_python3', '-i', '-O--buildsystem=pybuild']
D: dh_python3 dh_python3:181: options: Namespace(guess_deps=True, skip_private=False, verbose=True, arch=False, package=None, no_package=None, remaining_packages=False, compile_all=False, vrange=None, regexpr=None, accept_upstream_versions=False, depends=None, depends_section=None, recommends=None, recommends_section=None, suggests=None, suggests_section=None, requires=None, shebang=None, ignore_shebangs=False, clean_dbg_pkg=True, no_ext_rename=False, no_shebang_rewrite=False, private_dir=None, O=['--buildsystem=pybuild'])
D: dh_python3 dh_python3:182: supported Python versions: 3.11 (default=3.11)
D: dh_python3 debhelper:175: skipping package pymca-data (missing ${python3:Depends} in Depends/Recommends)
D: dh_python3 debhelper:175: skipping package pymca-doc (missing ${python3:Depends} in Depends/Recommends)
D: dh_python3 debhelper:183: source=pymca, binary packages=['pymca']
D: dh_python3 dh_python3:204: processing package pymca...
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/rgbcorrelator
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/pymcaroitool
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/pymcapostbatch
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/pymcabatch
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/pymca
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/peakidentifier
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/mca2edf
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/elementsinfo
I: dh_python3 tools:114: replacing shebang in debian/pymca/usr/bin/edfviewer
D: dh_python3 fs:400: package pymca details = {'requires.txt': set(), 'egg-info': set(), 'dist-info': set(), 'nsp.txt': set(), 'shebangs': {/usr/bin/python2, /usr/bin/python2, /usr/bin/python2, /usr/bin/python2, /usr/bin/python2, /usr/bin/python2, /usr/bin/python2, /usr/bin/python2, /usr/bin/python2}, 'public_vers': set(), 'private_dirs': {}, 'compile': False, 'ext_vers': set(), 'ext_no_version': set()}

It was not the case before..., so my question is: is this normal, or a bug in dh_python3?

Cheers

Frederic
Re: installation of script in a dedicated package
Hello Louis,

> It seems the only thing this line does is to install /usr/bin/silx. This can
> be done 'manually' via dh_install (see man dh_install).

Yes, it installs only this script for now. I can do it by hand, but in that case I need to let Python build the script from the entry point and then move it out of the pybuild directory.

> I personally tend to prefer having a file like 'debian/python3-silx.install'
> instead of having a 'dh_install' line in 'debian/rules', as it yields a
> 'cleaner' d/rules file.

I agree to disagree :). I like a lot the fact that I can see everything going on for a package by opening only one file. I am more productive this way, especially when I work on lots of packages: cut and paste between rules files is a lot easier. The multiplication of .install files seems to me a waste of time :). To each their own :).

Cheers

Frederic, and thanks for your time.
installation of script in a dedicated package
Hello,

I am trying to update the silx package, and I want to replace this call

python3 setup.py install_scripts -d debian/silx/usr/bin

with the right call without setup.py.

thanks for your help

Frederic
Re: change in the extension importation with 3.11
There is a fix from upstream around enum:

https://github.com/boostorg/python/commit/a218babc8daee904a83f550fb66e5cb3f1cb3013

    Fix enum_type_object type on Python 3.11

    The enum_type_object type inherits from PyLong_Type which is not
    tracked by the GC. Instances don't have to be tracked by the GC:
    remove the Py_TPFLAGS_HAVE_GC flag.

    The Python C API documentation says: "To create a container type, the
    tp_flags field of the type object must include the Py_TPFLAGS_HAVE_GC
    and provide an implementation of the tp_traverse handler."
    https://docs.python.org/dev/c-api/gcsupport.html

    The new exception was introduced in Python 3.11 by:
    python/cpython#88429

Any opinion?
Re: change in the extension importation with 3.11
My pytango package has the same problem:

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1024078

I: pybuild base:240: cd /<>/.pybuild/cpython3_3.11_tango/build; python3.11 -m pytest tests
ImportError while loading conftest '/<>/.pybuild/cpython3_3.11_tango/build/tests/conftest.py'.
tests/conftest.py:3: in <module>
    from tango.test_utils import state, typed_values, server_green_mode
tango/__init__.py:83: in <module>
    from ._tango import (
E   SystemError: initialization of _tango raised unreported exception

It also uses the boost_python library... something is smelly around boost_python and python3.11...

Cheers.
Re: change in the extension importation with 3.11
In order to debug this, I started gdb, set a breakpoint in init_module_scitbx_linalg_ext, then a catch throw, and I ended up with this backtrace:

Catchpoint 2 (exception thrown), 0x770a90a1 in __cxxabiv1::__cxa_throw (obj=0xb542e0, tinfo=0x772d8200, dest=0x772c1290) at ../../../../src/libstdc++-v3/libsupc++/eh_throw.cc:81
81      ../../../../src/libstdc++-v3/libsupc++/eh_throw.cc: Le dossier n'est pas vide.
(gdb) bt
#0  0x770a90a1 in __cxxabiv1::__cxa_throw (obj=0xb542e0, tinfo=0x772d8200, dest=0x772c1290) at ../../../../src/libstdc++-v3/libsupc++/eh_throw.cc:81
#1  0x772ad089 in boost::python::throw_error_already_set () at libs/python/src/errors.cpp:61
#2  0x772b6f05 in boost::python::objects::(anonymous namespace)::new_enum_type (doc=0x0, name=0x7743ddf9 "bidiagonal_matrix_kind") at libs/python/src/object/enum.cpp:169
#3  boost::python::objects::enum_base::enum_base (this=this@entry=0x7fffcee0, name=name@entry=0x7743ddf9 "bidiagonal_matrix_kind", to_python=to_python@entry=0x7741f720 ::to_python(void const*)>, convertible=convertible@entry=0x77422e50 ::convertible_from_python(_object*)>, construct=construct@entry=0x7741fb60 ::construct(_object*, boost::python::converter::rvalue_from_python_stage1_data*)>, id=..., doc=0x0) at libs/python/src/object/enum.cpp:204
#4  0x774203cb in boost::python::enum_::enum_ (this=0x7fffcee0, name=0x7743ddf9 "bidiagonal_matrix_kind", doc=0x0) at /usr/include/boost/python/enum.hpp:45
#5  0x77428330 in scitbx::matrix::boost_python::bidiagonal_matrix_svd_decomposition_wrapper::wrap (name=name@entry=0x7743dbd0 "svd_decomposition_of_bidiagonal_matrix") at ./scitbx/linalg/boost_python/svd.cpp:19
#6  0x7741f6b0 in scitbx::matrix::boost_python::wrap_svd () at ./scitbx/linalg/boost_python/svd.cpp:66
#7  0x773f8aa3 in scitbx::matrix::boost_python::(anonymous namespace)::init_module () at ./scitbx/linalg/boost_python/linalg_ext.cpp:19
#8  0x772c13e3 in boost::function0::operator() (this=0x7fffd2b0) at ./boost/function/function_template.hpp:763
#9  boost::python::handle_exception_impl (f=...) at libs/python/src/errors.cpp:25
#10 0x772c1b69 in boost::python::handle_exception (f=) at ./boost/function/function_template.hpp:635
#11 boost::python::detail::(anonymous namespace)::init_module_in_scope (init_function=0x773f8ac0, m=) at libs/python/src/module.cpp:24
#12 boost::python::detail::init_module (moduledef=..., init_function=0x773f8ac0) at libs/python/src/module.cpp:43

Not crystal clear to me :)
Re: build package xrayutilities - wheel and pip with setuptools
Hello, just for info. I can confirm that the problem was in the upstream build system, which was not compatible with setuptools > 60. After patching the build system, it works out of the box without any change to the rules file. So there is no issue in pybuild :)), or at least nothing new :)

cheers

Frederic
Re: build package xrayutilities - wheel and pip with setuptools
Upstream was right :), their build system is completely broken with setuptools > 60. I need to fix this.

thanks for your help

Fred

Traceback (most recent call last):
  File "/<>/setup.py", line 1137, in <module>
    setup_package()
  File "/<>/setup.py", line 1133, in setup_package
    setup(**setup_kwargs)
  File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 87, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python3/dist-packages/setuptools/_distutils/core.py", line 172, in setup
    ok = dist.parse_command_line()
  File "/usr/lib/python3/dist-packages/setuptools/_distutils/dist.py", line 474, in parse_command_line
    args = self._parse_command_opts(parser, args)
  File "/usr/lib/python3/dist-packages/setuptools/dist.py", line 1107, in _parse_command_opts
    nargs = _Distribution._parse_command_opts(self, parser, args)
  File "/usr/lib/python3/dist-packages/setuptools/_distutils/dist.py", line 540, in _parse_command_opts
    raise DistutilsClassError(
distutils.errors.DistutilsClassError: command class must subclass Command
E: pybuild pybuild:379: clean: plugin distutils failed with: exit code=1: python3.10 setup.py clean
dh_auto_clean: error: pybuild --clean -i python{version} -p 3.10 returned exit code 13
make: *** [debian/rules:16: clean] Error 25
dpkg-buildpackage: error: fakeroot debian/rules clean subprocess returned exit status 2

----- Original message -----
From: PICCA Frederic-Emmanuel <frederic-emmanuel.pi...@synchrotron-soleil.fr>
To: Scott Kitterman <deb...@kitterman.com>
Cc: debian-python@lists.debian.org
Sent: Wed, 02 Nov 2022 09:03:45 +0100 (CET)
Subject: Re: build package xrayutilities - wheel and pip with setuptools

Ok, I understand better what is going on. In pyproject.toml there are these lines:

[build-system]
requires = [
    "wheel",
    "setuptools<60.0.0",
    "oldest-supported-numpy",
    "scipy",
    "sphinx",
    "nbsphinx",
    "silx>=0.10",
    "Cython>=0.25"
]

Indeed the setuptools < 60 requirement cannot be fulfilled, so I think this is why it tries to use pip to install the required version of setuptools. The stack trace seems to confirm this:

   dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
	install -d /<<BUILDDIR>>/pyfai-0.21.3\+dfsg1/debian/.debhelper/generated/_source/home
	pybuild --clean -i python{version} -p 3.10
I: pybuild base:240: python3.10 setup.py clean
/<<PKGBUILDDIR>>/setup.py:43: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
  from distutils.command.clean import clean as Clean
/usr/lib/python3/dist-packages/_distutils_hack/__init__.py:18: UserWarning: Distutils was imported before Setuptools, but importing Setuptools also replaces the `distutils` module in `sys.modules`. This may lead to undesirable behaviors or errors. To avoid these issues, avoid using distutils directly, ensure that setuptools is installed in the traditional way (e.g. not an editable install), and/or make sure that setuptools is always imported before distutils.
  warnings.warn(
/usr/lib/python3/dist-packages/_distutils_hack/__init__.py:33: UserWarning: Setuptools is replacing distutils.
  warnings.warn("Setuptools is replacing distutils.")
INFO: Disabling color, you really want to install colorlog.
INFO:pythran:Disabling color, you really want to install colorlog.
INFO:pyFAI.setup:Use setuptools with cython
INFO:pyFAI.setup:Use setuptools.setup
/usr/lib/python3/dist-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
  warnings.warn(
/usr/lib/python3/dist-packages/setuptools/installer.py:60: UserWarning: Unbuilt egg for pyFAI [unknown version] (/<<PKGBUILDDIR>>)
  environment = pkg_resources.Environment()
WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x7fbbf0c8ae60>: Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x7fbbf0c8ad40>: Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
WARNING: Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=N
Re: build package xrayutilities - wheel and pip with setuptools
Ok, I understand better what is going on. In pyproject.toml there are these lines:

[build-system]
requires = [
    "wheel",
    "setuptools<60.0.0",
    "oldest-supported-numpy",
    "scipy",
    "sphinx",
    "nbsphinx",
    "silx>=0.10",
    "Cython>=0.25"
]

Indeed, the setuptools<60.0.0 requirement cannot be fulfilled, so I think this is why it tries to use pip in order to install the required version of setuptools. The stack trace seems to confirm this.

dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
	install -d /<>/pyfai-0.21.3\+dfsg1/debian/.debhelper/generated/_source/home
	pybuild --clean -i python{version} -p 3.10
I: pybuild base:240: python3.10 setup.py clean
/<>/setup.py:43: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
  from distutils.command.clean import clean as Clean
/usr/lib/python3/dist-packages/_distutils_hack/__init__.py:18: UserWarning: Distutils was imported before Setuptools, but importing Setuptools also replaces the `distutils` module in `sys.modules`. This may lead to undesirable behaviors or errors. To avoid these issues, avoid using distutils directly, ensure that setuptools is installed in the traditional way (e.g. not an editable install), and/or make sure that setuptools is always imported before distutils.
  warnings.warn(
/usr/lib/python3/dist-packages/_distutils_hack/__init__.py:33: UserWarning: Setuptools is replacing distutils.
  warnings.warn("Setuptools is replacing distutils.")
INFO: Disabling color, you really want to install colorlog.
INFO:pythran:Disabling color, you really want to install colorlog.
INFO:pyFAI.setup:Use setuptools with cython
INFO:pyFAI.setup:Use setuptools.setup
/usr/lib/python3/dist-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
  warnings.warn(
/usr/lib/python3/dist-packages/setuptools/installer.py:60: UserWarning: Unbuilt egg for pyFAI [unknown version] (/<>)
  environment = pkg_resources.Environment()
WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
WARNING: Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
WARNING: Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
WARNING: Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))': /simple/setuptools/
ERROR: Could not find a version that satisfies the requirement setuptools<60.0.0 (from versions: none)
ERROR: No matching distribution found for setuptools<60.0.0
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/setuptools/installer.py", line 82, in fetch_build_egg
    subprocess.check_call(cmd)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python3.10', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmpmx8yh9o_', '--quiet', 'setuptools<60.0.0']' returned non-zero exit status 1.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/<>/setup.py", line 1137, in
    setup_package()
  File "/<>/setup.py", line 1133, in setup_package
    setup(**setup_kwargs)
  File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 86, in setup
    _install_setup_requires(attrs)
  File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 80, in _install_setup_requires
    dist.fetch_build_eggs(dist.setup_requires)
  File "/usr/lib/python3/dist-packages/setuptools/dist.py", line 875, in fetch_build_eggs
    resolved_dists = pkg_resources.working_set.resolve(
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 789, in resolve
    dist = best[req.key] = env.best_match(
  File "/usr/lib/py
Re: build package xrayutilities - wheel and pip with setuptools
Hello, I just added

export PYBUILD_SYSTEM=distutils

so now pybuild uses the setup.py file (at least this is a workaround :). Now it fails with this error:

/usr/bin/python3.10: No module named pip

So I am wondering whether pip should not be added as a dependency of setuptools? I will manually add pip to the B-D and see if it works at the end.

Cheers

Fred

dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
	install -d /<>/pyfai-0.21.3\+dfsg1/debian/.debhelper/generated/_source/home
	pybuild --clean -i python{version} -p 3.10
I: pybuild base:240: python3.10 setup.py clean
/<>/setup.py:43: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
  from distutils.command.clean import clean as Clean
/usr/lib/python3/dist-packages/_distutils_hack/__init__.py:18: UserWarning: Distutils was imported before Setuptools, but importing Setuptools also replaces the `distutils` module in `sys.modules`. This may lead to undesirable behaviors or errors. To avoid these issues, avoid using distutils directly, ensure that setuptools is installed in the traditional way (e.g. not an editable install), and/or make sure that setuptools is always imported before distutils.
  warnings.warn(
/usr/lib/python3/dist-packages/_distutils_hack/__init__.py:33: UserWarning: Setuptools is replacing distutils.
  warnings.warn("Setuptools is replacing distutils.")
INFO: Disabling color, you really want to install colorlog.
INFO:pythran:Disabling color, you really want to install colorlog.
INFO:pyFAI.setup:Use setuptools with cython
INFO:pyFAI.setup:Use setuptools.setup
/usr/lib/python3/dist-packages/setuptools/installer.py:27: SetuptoolsDeprecationWarning: setuptools.installer is deprecated. Requirements should be satisfied by a PEP 517 installer.
  warnings.warn(
/usr/lib/python3/dist-packages/setuptools/installer.py:34: UserWarning: Unbuilt egg for pyFAI [unknown version] (/<>)
  pkg_resources.get_distribution('wheel')
WARNING: The wheel package is not available.
/usr/lib/python3/dist-packages/setuptools/installer.py:60: UserWarning: Unbuilt egg for pyFAI [unknown version] (/<>)
  environment = pkg_resources.Environment()
/usr/bin/python3.10: No module named pip
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/setuptools/installer.py", line 82, in fetch_build_egg
    subprocess.check_call(cmd)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python3.10', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmppdpybvy1', '--quiet', 'setuptools<60.0.0']' returned non-zero exit status 1.

----- Original message -----
From: Scott Kitterman <deb...@kitterman.com>
To: debian-python@lists.debian.org
Sent: Tue, 01 Nov 2022 20:36:20 +0100 (CET)
Subject: Re: build package xrayutilities - wheel and pip with setuptools

On Tuesday, November 1, 2022 3:31:47 PM EDT PICCA Frederic-Emmanuel wrote:
> > I don't think it should do that, so we need to investigate. Where
> > can I find the updated packaging?
>
> I did not push the change right now, I will push once I solve this issue :).
>
> my opinion is that I should force via PYBUILD_SYSTEM=distutils
>
> Fred

If that works, I think it's fine, but I don't think it should be necessary.
Let me know once you push the package and I'll see if there's a pybuild issue.

Scott K
Re: build package xrayutilities - wheel and pip with setuptools
> I don't think it should do that, so we need to investigate. Where can I find
> the updated packaging?

I have not pushed the change yet; I will push once I solve this issue :).

My opinion is that I should force it via PYBUILD_SYSTEM=distutils.

Fred
Re: build package xrayutilities - wheel and pip with setuptools
> It looks to me like the current pyproject.toml file for pyfai is not sufficient
> to build the package, so I would be tempted to keep what you have now.

Due to the presence of this file, pybuild tries to build using the "new way" instead of building with setup.py. I do not know whether other packages are in this state, but it produces an FTBFS.

Cheers
Re: build package xrayutilities - wheel and pip with setuptools
> As far as I can see, the package doesn't ship any files in /usr/bin. Why do
> you need to build man pages (I'm assuming that's what that's for)? More
> generically, what problem did that step in the process solve that's not solved
> now?

This is for the pyfai package, which I need to update:

https://salsa.debian.org/science-team/pyfai
Re: build package xrayutilities - wheel and pip with setuptools
Thanks for your help. I have one more question: I have this command from the previous build:

{interpreter} setup.py build_man

How can I translate this to the new build system?
Re: build package xrayutilities - wheel and pip with setuptools
> Hello Frederic,

Hello Carsten

> please could you provide next time direct links to the VCS/Tracker of
> your package, that saves time searching for the correct package on my
> or other people's side. Also a speaking subject line helps me to
> decide if I want to spend time on taking a look; you've chosen a very
> generic one. :-)

ack

> https://tracker.debian.org/pkg/xrayutilities
> https://salsa.debian.org/science-team/python-xrayutilities

> No, as you have overwritten the default target by a call that is quite
> similar to the original pybuild call.
> Yes, the environment isn't the same for the second call, you'd need to
> ensure that the build directory is clean before starting the second run.
> But I don't see why this construct you've built in d/rules is needed
> that way.

ok

> I can build the package basically doing these modifications and by
> adding the additional B-D package Scott did mention. Simply let
> dh_sphinxdoc build the documentation and adding the additional needed
> package dependencies.

[your patch]

In your proposition you removed the arch/indep split; is that on purpose?

> lintian has some additional remarks anyway, I haven't looked further at
> these.

ok

Cheers

Fred
ctype and dh_python3 extension rename
Hello, I am packaging the latest tomopy package. This software uses ctypes to open some shared libraries during import, but I have this error message:

>>> import tomopy
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3/dist-packages/tomopy/__init__.py", line 63, in <module>
    from tomopy.misc.corr import *
  File "/usr/lib/python3/dist-packages/tomopy/misc/corr.py", line 60, in <module>
    import tomopy.util.extern as extern
  File "/usr/lib/python3/dist-packages/tomopy/util/extern/__init__.py", line 86, in <module>
    from tomopy.util.extern.prep import *
  File "/usr/lib/python3/dist-packages/tomopy/util/extern/prep.py", line 65, in <module>
    LIB_TOMOPY_PREP = c_shared_lib("libtomopy-prep")
  File "/usr/lib/python3/dist-packages/tomopy/util/extern/__init__.py", line 69, in c_shared_lib
    raise ModuleNotFoundError(
ModuleNotFoundError: The following shared library is missing:
/usr/lib/python3/dist-packages/tomopy/util/extern/libtomopy-prep.so

Indeed, during the build process this module was renamed into libtomopy-prep.cpython-310-x86_64-linux-gnu.so.

So my question is... how can I solve this issue?
- do not rename the extension
- change the logic in order to find the renamed extension (I like this one, but where can I find the Debian module helper which allows doing this?)

Thanks for your help

Frederic
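A loader helper along these lines could cope with the rename (a sketch only, not the actual tomopy code; the directory argument and function names are illustrative):

```python
import ctypes
import glob
import os


def find_shared_lib(basedir, name):
    """Return the path of name.so, falling back to the renamed
    name.cpython-*.so variant produced during the Debian build."""
    exact = os.path.join(basedir, name + ".so")
    if os.path.exists(exact):
        return exact
    candidates = sorted(glob.glob(os.path.join(basedir, name + ".cpython-*.so")))
    if not candidates:
        raise ModuleNotFoundError(
            "The following shared library is missing: " + exact)
    return candidates[0]


def c_shared_lib(basedir, name):
    # ctypes.CDLL does not care about the suffix, only about the
    # real file name on disk.
    return ctypes.CDLL(find_shared_lib(basedir, name))
```

The alternative, keeping the plain .so name, would mean telling the build system not to apply the extension suffix; a fallback like the one above avoids patching the build.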
Re: dxtbx pytest issue
For the record, I found it... Upstream modifies HDF5_PLUGIN_PATH when the dxtbx module is loaded. They assume they are running under conda and override the path. All this is useless on Debian, since the plugins are properly installed system-wide.

Cheers

Fred

# Ensures that HDF5 has the conda_base plugin path configured.
#
# Ideally this will be properly configured by the conda environment.
# However, currently the dials-installer will not install a path-correct
# conda_base folder, so it needs to be updated manually.
_hdf5_plugin_path = libtbx.env.under_base(os.path.join("lib", "hdf5", "plugin"))

# Inject via the environment if h5py not used yet, or else use h5py
if "h5py" not in sys.modules:
    os.environ["HDF5_PLUGIN_PATH"] = (
        _hdf5_plugin_path + os.pathsep + os.getenv("HDF5_PLUGIN_PATH", "")
    )
else:
    # We've already loaded h5py, so setting the environment variable won't work
    # We need to use the h5py API to add a plugin path
    import h5py

    h5_plugin_paths = [h5py.h5pl.get(i).decode() for i in range(h5py.h5pl.size())]
    if _hdf5_plugin_path not in h5_plugin_paths:
        h5py.h5pl.prepend(_hdf5_plugin_path.encode())
Re: dxtbx pytest issue
Hello Neil

> Looks like you need a -v option to see more detail.

Thanks for the advice. I found, by removing files one by one, that the failing behaviour is due to the import of the library itself. The failing test PASSes by itself, but if I add a useless `import dxtbx` inside, it fails. So there is some interaction between the dxtbx package and the h5py library; I will dig into it.

Cheers

Fred
dxtbx pytest issue
Hello, I am still trying to package[2] dxtbx[1], and now I end up with something strange. When I run the test suite during the build I have a failure like this:

tests/test_dataset_as_flex.py ..F..F..F [ 2%]

I put the error message below; it is quite long.

Now, if I execute only the failing test by hand, like this, it works:

$ pytest-3 tests/test_dataset_as_flex.py
= test session starts =
platform linux -- Python 3.10.6, pytest-7.1.2, pluggy-1.0.0+repack
rootdir: /home/picca/debian/science-team/dxtbx, configfile: pytest.ini
plugins: requests-mock-1.9.3, forked-1.4.0, xdist-2.5.0, mock-3.8.2, dials-data-2.4.0
collected 9 items

tests/test_dataset_as_flex.py . [100%]

== 9 passed in 0.61s ==

Before investigating further, I would like your advice on how to debug this sort of issue. First, what is the difference between pytest and pytest

thanks for your help

Frederic

[1] https://github.com/cctbx/dxtbx
[2] https://salsa.debian.org/science-team/dxtbx

full error message

___ test_dataset_as_flex[int-dataset_as_flex_int-bshuf_lz4]

type_name = 'int', creator = converter =

@pytest.mark.parametrize(
    "creator",
    [
        uncompressed,
        gzip,
        bshuf_lz4,
    ],
)
@pytest.mark.parametrize(
    "type_name,converter",
    [
        ("int", dataset_as_flex_int),
        ("float", dataset_as_flex_float),
        ("double", dataset_as_flex_double),
    ],
)
def test_dataset_as_flex(type_name, creator, converter):
    # Create an in-memory HDF5 dataset with unique name
    f = h5py.File(type_name + ".h5", "w", driver="core", backing_store=False)
    shape = (20, 20, 20)
>   dataset = creator(f, shape, type_name)

tests/test_dataset_as_flex.py:64:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_dataset_as_flex.py:34: in bshuf_lz4
    return file.create_dataset(
/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/group.py:161: in create_dataset
    dsid = dataset.make_new_dset(group, shape, dtype, data, name, **kwds)
/usr/lib/python3/dist-packages/h5py/_debian_h5py_serial/_hl/dataset.py:106: in make_new_dset
    dcpl = filters.fill_dcpl(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

plist =
shape = (20, 20, 20), dtype = dtype('int64'), chunks = (10, 10, 10)
compression = 32008, compression_opts = (0, 2), shuffle = None
fletcher32 = None, maxshape = None, scaleoffset = None, external = []
allow_unknown_filter = False

def fill_dcpl(plist, shape, dtype, chunks, compression, compression_opts,
              shuffle, fletcher32, maxshape, scaleoffset, external,
              allow_unknown_filter=False):
    """ Generate a dataset creation property list.

    Undocumented and subject to change without warning.
    """

    if shape is None or shape == ():
        shapetype = 'Empty' if shape is None else 'Scalar'
        if any((chunks, compression, compression_opts, shuffle, fletcher32,
                scaleoffset is not None)):
            raise TypeError(
                f"{shapetype} datasets don't support chunk/filter options"
            )
        if maxshape and maxshape != ():
            raise TypeError(f"{shapetype} datasets cannot be extended")
        return h5p.create(h5p.DATASET_CREATE)

    def rq_tuple(tpl, name):
        """ Check if chunks/maxshape match dataset rank """
        if tpl in (None, True):
            return
        try:
            tpl = tuple(tpl)
        except TypeError:
            raise TypeError('"%s" argument must be None or a sequence object' % name)
        if len(tpl) != len(shape):
            raise ValueError('"%s" must have same rank as dataset shape' % name)

    rq_tuple(chunks, 'chunks
Re: PYTHONPATH with cmake build
Hello Stephano

I end up with this:

#! /usr/bin/make -f

export DH_VERBOSE=1
export PYBUILD_NAME=dxtbx
export PYBUILD_SYSTEM=distutils
export PYBUILD_AFTER_CONFIGURE=cmake -DPython_EXECUTABLE=/usr/bin/{interpreter} -S . -B {build_dir}/lib
export PYBUILD_AFTER_BUILD=make -C {build_dir}/lib
export PYBUILD_BEFORE_TEST=cp {build_dir}/lib/lib/*.so {build_dir}
export PYBUILD_BEFORE_INSTALL=cp {build_dir}/lib/lib/*.so {build_dir}; rm -rf {build_dir}/lib

%:
	dh $@ --with python3 --buildsystem=pybuild

override_dh_auto_test:

It seems to work, but I am not really confident about the 'rm -rf {build_dir}/lib' during the install. If I do not remove these files, they get installed by pybuild. No more magical obj-$blabla path :)

I can now concentrate on the unit test failures...

Cheers

Fred
Re: PYTHONPATH with cmake build
> Oh. In this case setting PYTHONPATH (if it works, and I'm not 100% sure it
> will) sounds like a better option.
> So, the cmake build path is, as far as I know, defined in
> /usr/share/perl5/Debian/Debhelper/Buildsystem.pm to be just
> "obj-$DEB_HOST_GNU_TYPE".

Thanks for the info. It seems to me that the builddir can be set directly in the rules file with --builddir=. So do you know a way to extract the real builddir during the build process?

Cheers
Re: PYTHONPATH with cmake build
> /usr/share/perl5/Debian/Debhelper/Buildsystem.pm to be just
> "obj-$DEB_HOST_GNU_TYPE".

Thanks for the info. If I am not wrong, during the build process we can set up a new builddir. So is it possible to obtain the real builddir during the build?
Re: PYTHONPATH with cmake build
> When trying to run tests you should look how the upstream intends to run them.

Yes, they are building the module in place, like this (from https://github.com/dials/dxtbx/blob/main/.azure-pipelines/unix-build.yml):

# Build dxtbx
- bash: |
    set -e
    . conda_base/bin/activate
    set -ux
    mkdir build
    cd build
    cmake ../modules/dxtbx
    cmake --build . --target install
    pip install ../modules/dxtbx
  displayName: Build dxtbx
  workingDirectory: $(Pipeline.Workspace)

then

# Finally, run the full regression test suite
- bash: |
    set -e
    . conda_base/bin/activate
    set -ux
    export DIALS_DATA=${PWD}/data
    cd modules/dxtbx
    export PYTHONDEVMODE=1
    pytest -v -ra -n auto --basetemp="$(Pipeline.Workspace)/tests" --durations=10 \
        --cov=dxtbx --cov-report=html --cov-report=xml --cov-branch \
        --timeout=5400 --regression || echo "##vso[task.complete result=Failed;]Some tests failed"
  displayName: Run tests
  workingDirectory: $(Pipeline.Workspace)

-- WBR, wRAR
Re: PYTHONPATH with cmake build
Hello Andrey

> Does the same happen when you run the test in the source tree manually?

I do not know; I am in the process of building the package in sbuild, so I am just trying to fix the build process. If I hard-code the lib path, the tests run but fail because of other missing modules, which is OK. So my question is more: how can I obtain the cmake default build path from the rules file?

cheers
PYTHONPATH with cmake build
Hello, I am packaging a python extension which uses cmake as its build system. Here is the repo:

https://salsa.debian.org/science-team/dxtbx

I try to activate the tests, but this causes this kind of trouble:

During handling of the above exception, another exception occurred:
/usr/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/serialize/test_xds.py:8: in <module>
    from dxtbx.imageset import ImageSetFactory
dxtbx/imageset.py:5: in <module>
    import dxtbx.format.image  # noqa: F401, import dependency for unpickling
dxtbx/format/image.py:8: in <module>
    from dxtbx_format_image_ext import *  # type: ignore # isort:skip # noqa: F403
E   ModuleNotFoundError: No module named 'dxtbx_format_image_ext'

Indeed, the extensions are built here:

cd /<>/obj-x86_64-linux-gnu/src/dxtbx && /usr/bin/cmake -E cmake_link_script CMakeFiles/dxtbx_flumpy.dir/link.txt --verbose=1
/usr/bin/c++ -fPIC -g -O2 -ffile-prefix-map=/<>=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -flto -Wl,-z,relro -shared -o ../../lib/dxtbx_flumpy.cpython-310-x86_64-linux-gnu.so CMakeFiles/dxtbx_flumpy.dir/boost_python/flumpy.cc.o /usr/lib/x86_64-linux-gnu/libboost_python310.so.1.74.0
lto-wrapper: warning: using serial compilation of 4 LTRANS jobs
lto-wrapper: note: see the ‘-flto’ option documentation for more information
[ 90%] Linking CXX shared module ../../lib/dxtbx_format_image_ext.so

So I need to add the /<>/obj-x86_64-linux-gnu/lib directory to the PYTHONPATH. Do you know how to do this?

thanks for your help

Fred
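Failing a debhelper-level answer, one workaround is to let the test run extend sys.path itself. A small helper (a sketch; the obj-*/lib layout is an assumption based on the default debhelper cmake build directory) that could live in a conftest.py:

```python
import glob
import os
import sys


def add_cmake_libdirs(source_dir):
    """Prepend any obj-*/lib cmake output directories found under
    source_dir to sys.path, so freshly built extensions are importable.
    Returns the list of directories that were actually added."""
    added = []
    for libdir in sorted(glob.glob(os.path.join(source_dir, "obj-*", "lib"))):
        if libdir not in sys.path:
            sys.path.insert(0, libdir)
            added.append(libdir)
    return added
```

Exporting PYTHONPATH from d/rules achieves the same thing without touching the source tree.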
howto strip an install_requires with pybuild
Hello, I am working on the silx package, and the upstream install_requires is sort of wrong. It depends on hdf5plugin, which is not necessary on Debian. The purpose of hdf5plugin is to register an HDF5 plugin when it is loaded; the code of the application uses a try/except in order to work without this package installed. Upstream wants the full install to contain hdf5plugin; on Debian we just install the libhdf5-plugin- packages and that's it.

So my question is: do we have a way to STRIP an entry from install_requires via pybuild (like dpkg-buildflags), or must I patch the build system and keep this patch forever :)

The real problem is due to the generated entry_point, which refuses to start the application because of the missing hdf5plugin.

Cheers

Frederic
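For context, the optional-import pattern described above looks roughly like this (a sketch; hdf5plugin is the real module name, but the surrounding code is illustrative rather than silx's actual source):

```python
# The application degrades gracefully when hdf5plugin is absent; on
# Debian the HDF5 compression filters come from the system
# libhdf5-plugin packages instead, so the import is allowed to fail.
try:
    import hdf5plugin  # noqa: F401  (importing it registers the filters)
    HAVE_HDF5PLUGIN = True
except ImportError:
    HAVE_HDF5PLUGIN = False
```

The entry point failure is a different mechanism: pkg_resources re-checks the declared install_requires at startup, so the try/except in the application code does not help there.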
RE:How to install older python version on Debian
> Hello,
> I'd suggest you build it from source (python.org/ftp... with the needed
> version) as an additional python version, and then create your venv using the
> 3.6.
> You can dm me if you might need more details.

It would be great to have a python-builder package which generates a pythonX.Y package from the upstream sources, or maybe something like python-buildpackage:

Python2.7-src.tar.gz -> python3.6 package
embedded-javascript-library usr/share/doc/python3-lmfit/html/_static/language_data.js please use sphinx
Hello, I am working on the lmfit-py package. lintian complains about this:

https://salsa.debian.org/science-team/lmfit-py/-/jobs/909498

I use sphinx, so my question is: do you know how to fix this issue? Lots of packages are affected by this:

https://packages.debian.org/search?searchon=contents&keywords=language_data.js&mode=path&suite=stable&arch=any

cheers

Frederic
RE:Offer to help with packaging
What about helping with https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=946035 ?

python-language-server is the only one missing now.

cheers
RE:pkg-config of python3
> If you want to embed python in an application, you need to use
> python3-embed.pc
> Or python3-config --embed

Then it links the program with -lpython3.8. So what is the purpose of python3.pc?

thanks

Fred
pkg-config of python3
Hello, I am writing a program in Haskell which uses Python3. It fails with:

Linking dist/build/binoculars/binoculars ...
src/Hkl/Python/Pyfi.hs:124: error: undefined reference to « Py_BuildValue »
src/Hkl/Python/Pyfi.hs:199: error: undefined reference to « PyUnicode_AsUTF8 »
src/Hkl/Python/Pyfi.hs:261: error: undefined reference to « PyObject_CallObject »
src/Hkl/Python/Pyfi.hs:207: error: undefined reference to « Py_Initialize »
src/Hkl/Python/Pyfi.hs:254: error: undefined reference to « PyObject_CallObject »
src/Hkl/Python/Pyfi.hs:270: error: undefined reference to « Py_BuildValue »
src/Hkl/Python/Pyfi.hs:278: error: undefined reference to « Py_BuildValue »
src/Hkl/Python/Pyfi.hs:288: error: undefined reference to « Py_BuildValue »
/usr/include/python3.8/numpy/__multiarray_api.h:1466: error: undefined reference to « PyImport_ImportModule »
/usr/include/python3.8/numpy/__multiarray_api.h:1472: error: undefined reference to « PyObject_GetAttrString »
/usr/include/python3.8/numpy/__multiarray_api.h:1480: error: undefined reference to « PyCapsule_Type »
/usr/include/python3.8/numpy/__multiarray_api.h:1485: error: undefined reference to « PyCapsule_GetPointer »
/usr/include/python3.8/object.h:478: error: undefined reference to « _Py_Dealloc »
/usr/include/python3.8/object.h:478: error: undefined reference to « _Py_Dealloc »
/usr/include/python3.8/numpy/__multiarray_api.h:1508: error: undefined reference to « PyExc_RuntimeError »
/usr/include/python3.8/numpy/__multiarray_api.h:1508: error: undefined reference to « PyErr_Format »
/usr/include/python3.8/numpy/__multiarray_api.h:1481: error: undefined reference to « PyExc_RuntimeError »
/usr/include/python3.8/numpy/__multiarray_api.h:1481: error: undefined reference to « PyErr_SetString »
/usr/include/python3.8/object.h:478: error: undefined reference to « _Py_Dealloc »
/usr/include/python3.8/numpy/__multiarray_api.h:1531: error: undefined reference to « PyExc_RuntimeError »
/usr/include/python3.8/numpy/__multiarray_api.h:1531: error: undefined reference to « PyErr_Format »
/usr/include/python3.8/numpy/__multiarray_api.h:1475: error: undefined reference to « PyExc_AttributeError »
/usr/include/python3.8/numpy/__multiarray_api.h:1475: error: undefined reference to « PyErr_SetString »
/usr/include/python3.8/numpy/__multiarray_api.h:1496: error: undefined reference to « PyExc_RuntimeError »
/usr/include/python3.8/numpy/__multiarray_api.h:1496: error: undefined reference to « PyErr_SetString »
src/Hkl/Python/cdefs.c:19: error: undefined reference to « PyImport_AddModule »
src/Hkl/Python/cdefs.c:20: error: undefined reference to « PyModule_GetDict »
src/Hkl/Python/cdefs.c:21: error: undefined reference to « PyDict_GetItemString »
src/Hkl/Python/cdefs.c:22: error: undefined reference to « PyErr_Occurred »
src/Hkl/Python/cdefs.c:23: error: undefined reference to « PyErr_Print »
src/Hkl/Python/cdefs.c:23: error: undefined reference to « PyErr_Clear »
src/Hkl/Python/cdefs.c:32: error: undefined reference to « PyImport_AddModule »
src/Hkl/Python/cdefs.c:33: error: undefined reference to « PyModule_GetDict »
src/Hkl/Python/cdefs.c:34: error: undefined reference to « PyDict_GetItemString »
src/Hkl/Python/cdefs.c:35: error: undefined reference to « PyErr_Occurred »
src/Hkl/Python/cdefs.c:36: error: undefined reference to « PyErr_Print »
src/Hkl/Python/cdefs.c:36: error: undefined reference to « PyErr_Clear »
src/Hkl/Python/cdefs.c:44: error: undefined reference to « PyErr_Occurred »
src/Hkl/Python/cdefs.c:46: error: undefined reference to « PyErr_Fetch »
src/Hkl/Python/cdefs.c:47: error: undefined reference to « PyObject_Str »
src/Hkl/Python/cdefs.c:48: error: undefined reference to « PyObject_Str »
src/Hkl/Python/cdefs.c:49: error: undefined reference to « PyUnicode_AsUTF8 »
src/Hkl/Python/cdefs.c:50: error: undefined reference to « PyUnicode_AsUTF8 »
/usr/include/python3.8/object.h:478: error: undefined reference to « _Py_Dealloc »
src/Hkl/Python/cdefs.c:68: error: undefined reference to « PyImport_AddModule »
src/Hkl/Python/cdefs.c:69: error: undefined reference to « PyModule_GetDict »
src/Hkl/Python/cdefs.c:70: error: undefined reference to « PyEval_GetBuiltins »
src/Hkl/Python/cdefs.c:70: error: undefined reference to « PyDict_SetItemString »
src/Hkl/Python/cdefs.c:71: error: undefined reference to « _Py_BuildValue_SizeT »
src/Hkl/Python/cdefs.c:72: error: undefined reference to « PyRun_StringFlags »
src/Hkl/Python/cdefs.c:78: error: undefined reference to « PyErr_Occurred »
src/Hkl/Python/cdefs.c:78: error: undefined reference to « PyErr_Print »
src/Hkl/Python/cdefs.c:99: error: undefined reference to « PyType_IsSubtype »
src/Hkl/Python/cdefs.c:110: error: undefined reference to « PyType_IsSubtype »
src/Hkl/Python/cdefs.
RE:pybuild and setup.py in unusual place
I found this:

--sourcedirectory=src

Is it equivalent to -D?

A subsidiary question: is it possible to run a command before all the dh_auto_xxx without overriding everything? I need to run a command which generates the setup.py file, so I need to do:

override_dh_auto_XXX:
	do_something
	dh_auto_XXX

cheers

Frederic
pybuild and setup.py in unusual place
Hello, I have a package where the setup.py is not located at the root of the directory. So I need to do:

override_dh_auto_XXX:
	dh_auto_XXX -- -d

Is there an export of some variable which allows saying where the setup.py to deal with is located?

thanks

Frederic
RE:where should we put private libraries
> You should consider /usr/lib// if you want to make your
> package multiarch-safe.

And what about /usr/lib// ? Which one is better?
RE:where should we put private libraries
> > The issue is that the current build system does not provide an rpath for
> > these libraries, so I can not add one via chrpath.

> Well, ideally you need to fix the build system so that it sets the correct
> rpath directly.

I found patchelf, which allows adding an rpath :)) So I just need to set the rpath for all the extensions and voilà :)

Cheers

Frederic
RE:Automatically removing "badges" pictures from README.rst files
> Lintian-brush is a fine tool, but (correct me if I am wrong) it would
> generate a patch excluding badges, and patches require maintenance.

You are right; maybe we should have a dh_privacy helper for this purpose.

cheers

Fred
RE:Automatically removing "badges" pictures from README.rst files
What about lintian-brush?
pytest and core dump
Hello, I am working on the vitables package. During the build I get this error message [1]:

= test session starts ==
platform linux -- Python 3.7.6, pytest-4.6.9, py-1.8.1, pluggy-0.13.0
rootdir: /builds/science-team/vitables/debian/output/vitables-3.0.2
collected 54 items

tests/test_filenode.py Aborted (core dumped)
E: pybuild pybuild:341: test: plugin distutils failed with: e

This core dump is generated in the salsa-ci runners, and I would like a backtrace of it. So my question is: how can we instrument d/rules in order to print a stack trace during the build in case of a core dump? It should help the debugging.

thanks for your help

Frederic

[1] https://salsa.debian.org/science-team/vitables/-/jobs/554052
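One approach I would suggest (my assumption, not something vitables already does) is the stdlib faulthandler module: with PYTHONFAULTHANDLER=1 exported from d/rules, or enabled explicitly, Python prints the tracebacks of all threads when the process dies on SIGSEGV, SIGABRT and friends:

```python
# Sketch: enable faulthandler so a fatal signal (the "Aborted (core
# dumped)" above) at least leaves a Python-level traceback in the CI log.
# Equivalent to exporting PYTHONFAULTHANDLER=1 in debian/rules, or to
# running the tests with python3 -X faulthandler.
import faulthandler

faulthandler.enable()

# Quick check that the handler is active:
print(faulthandler.is_enabled())  # prints True
```

faulthandler only shows the Python frames; for the C side of the crash you still need gdb on the core file.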
RE:application and private module extension
Indeed, sorry for the noise.

Frederic
application and private module extension
Hello, I am packaging a python application, so I decided to put the module under the private directory /usr/share/. But this software contains a cython extension, so in the end I get a lintian error due to a binary file under /usr/share.

What is the best solution when we need to package software with this kind of extension?

thanks for your time

Frederic
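Since the package ships a compiled extension, the private module has to live under an architecture-appropriate directory such as /usr/lib/<package> rather than /usr/share. A common pattern (a sketch; "myapp" and its path are placeholders, not the actual package) is a small launcher in /usr/bin that prepends the private directory to sys.path:

```python
#!/usr/bin/python3
# Sketch of a /usr/bin launcher for an application whose private module
# (including its Cython extension) is installed under /usr/lib/myapp.
# "myapp" is a placeholder name, not a real package.
import sys

PRIVATE_DIR = "/usr/lib/myapp"


def setup_path(private_dir=PRIVATE_DIR):
    """Make the private directory importable before loading the app."""
    if private_dir not in sys.path:
        sys.path.insert(0, private_dir)
    return private_dir in sys.path


if __name__ == "__main__":
    setup_path()
    # from myapp.cli import main; main()  # placeholder entry point
```

Keeping the extension under /usr/lib also keeps the lintian check happy, since /usr/share is reserved for architecture-independent files.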
RE:pyqt5 and entrypoint
Hello, on unstable it works, but on buster (and we need to make it work for buster) we get this message:

Python 3.7.3 (default, Apr 3 2019, 05:39:12)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pkg_resources
>>> pkg_resources.get_distribution('PyQt5')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 481, in get_distribution
    dist = get_provider(dist)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 357, in get_provider
    return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 900, in require
    needed = self.resolve(parse_requirements(requirements))
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 786, in resolve
    raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'PyQt5' distribution was not found and is required by the application
pyqt5 and entrypoint
Hello, we are working on the next pyfai package. This new version uses entry_points like this:

gui_requires = ['PyQt5', 'h5py', 'hdf5plugin', 'PyOpenGL']
opencl_requires = ['pyopencl']
extras_require = {
    'calib2': gui_requires,  # Keep compatibility
    'gui': gui_requires,
    'opencl': opencl_requires,
    'full': gui_requires + opencl_requires,
}
console_scripts = [
    'check_calib = pyFAI.app.check_calib:main',
    'detector2nexus = pyFAI.app.detector2nexus:main',
    'diff_map = pyFAI.app.diff_map:main',
    'diff_tomo = pyFAI.app.diff_tomo:main',
    'eiger-mask = pyFAI.app.eiger_mask:main',
    'MX-calibrate = pyFAI.app.mx_calibrate:main',
    'pyFAI-average = pyFAI.app.average:main',
    'pyFAI-benchmark = pyFAI.app.benchmark:main',
    'pyFAI-calib = pyFAI.app.calib:main',
    'pyFAI-calib2 = pyFAI.app.calib2:main [gui]',
    'pyFAI-drawmask = pyFAI.app.drawmask:main',
    'pyFAI-diffmap = pyFAI.app.diff_map:main',
    'pyFAI-integrate = pyFAI.app.integrate:main',
    'pyFAI-recalib = pyFAI.app.recalib:main',
    'pyFAI-saxs = pyFAI.app.saxs:main',
    'pyFAI-waxs = pyFAI.app.waxs:main',
]
entry_points = {
    'console_scripts': console_scripts,
    # 'gui_scripts': [],
}

But once installed, we cannot start the GUI application.
jerome@patagonia:~/workspace/hdf5plugin$ pyFAI-calib2
Traceback (most recent call last):
  File "/usr/bin/pyFAI-calib2", line 11, in
    load_entry_point('pyFAI==0.19.0', 'console_scripts', 'pyFAI-calib2')()
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 489, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 2793, in load_entry_point
    return ep.load()
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 2410, in load
    self.require(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 2433, in require
    items = working_set.resolve(reqs, env, installer, extras=self.extras)
  File "/usr/lib/python3/dist-packages/pkg_resources/__init__.py", line 786, in resolve
    raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'PyQt5' distribution was not found and is required by the application

Indeed, python3-pyqt5 was installed. So is it a bug in the pyqt5 package, or is there a way to fix this in Debian? Thanks for your help, Frederic
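The distinction at play here is that pkg_resources does not look at importable modules at all; it only looks for installed distribution metadata (egg-info/dist-info), which a Debian binary package may or may not ship. The same behaviour can be reproduced with the stdlib importlib machinery (using the stdlib 'json' module as a stand-in, since it is importable but has no distribution metadata):

```python
# A module being importable does not imply that pkg_resources (or
# importlib.metadata, which mirrors it here) can find a distribution
# of that name: the lookup is purely on installed metadata.
import importlib.metadata
import importlib.util

# 'json' is importable -- it is part of the stdlib ...
assert importlib.util.find_spec("json") is not None

# ... but no distribution named 'json' is installed, so the metadata
# lookup fails, just like get_distribution('PyQt5') does when the
# Debian package ships the module without its egg-info.
try:
    importlib.metadata.distribution("json")
except importlib.metadata.PackageNotFoundError:
    print("importable, but no distribution metadata")
```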
RE:Streamlining the use of Salsa CI on team packages
Hello, and what if, in the end, the upstream could take care of the Debian packaging, by adding a .salsa-ci.yml in the upstream directory, in order to get feedback with nice badges? Cheers
dh-python does not stop if there is error during the byte-compiling
Hello, in one of my packages (pymca) there is a syntax error like this:

byte-compiling /builds/science-team/pymca/debian/output/pymca-5.5.2+dfsg/debian/python-pymca5/usr/lib/python2.7/dist-packages/PyMca5/Object3D/Object3DPlugins/ChimeraStack.py to ChimeraStack.pyc
  File "/usr/lib/python2.7/dist-packages/PyMca5/Object3D/Object3DPlugins/ChimeraStack.py", line 72
    with h5py.File(filename, mode='r') as f
                                          ^
SyntaxError: invalid syntax
byte-compiling /builds/science-team/pymca

(missing ':' at the end of the line) but this does not stop the build process. Is that normal? Cheers, Frederic
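The behaviour can be seen with the stdlib py_compile module, which byte-compilation tooling is typically built on: by default a SyntaxError is only reported on stderr and the call returns normally, whereas doraise=True turns it into an exception that would abort a build. A minimal sketch (the file name is made up for the demonstration):

```python
import os
import py_compile
import tempfile

# A hypothetical source file with the same kind of error: a 'with'
# statement missing its trailing ':'.
bad = os.path.join(tempfile.mkdtemp(), "chimera_stack.py")
with open(bad, "w") as f:
    f.write("with open('x') as f\n    pass\n")

# Default behaviour: the SyntaxError is only printed to stderr and
# compile() returns None, so a build calling it this way keeps going.
assert py_compile.compile(bad) is None

# With doraise=True the error surfaces as PyCompileError instead.
try:
    py_compile.compile(bad, doraise=True)
except py_compile.PyCompileError:
    print("caught PyCompileError")
```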
python equivalent of cabal-debian
Hello, I would like to know if there is an equivalent of cabal-debian [1], which helps a lot with maintaining Haskell packages. It allows doing the initial packaging and also upgrading the packaging, by updating the Build-Depends, etc. Cheers, Frederic [1] https://hackage.haskell.org/package/cabal-debian
RE:Webpage to track py2removal bugs & packages
Hello Sandro, I cannot find python-pyqtgraph in your list. It seems to me that this package has reverse dependencies, but the python2 binaries were removed... though that is another problem. Cheers, Frederic
RE:dh-python and pylint
Hello Sandro, > I've just submitted > https://salsa.debian.org/python-team/tools/dh-python/merge_requests/9 > to address this; not sure how quickly it will get merged & released Thanks a lot, but what about backports? On backports we still need this mapping. Cheers
dh-python and pylint
Hello, I am preparing the new spyder package. Since the removal of pylint3 from src:pylint, I need to remove the Build-Depends: pylint3. But dh_python3 still produces a pylint3 dependency for the binary packages. It seems that this is hard-coded here [0]. Is it a bug in dh-python? Cheers, Frederic [0] https://salsa.debian.org/python-team/tools/dh-python/blob/master/pydist/cpython3_fallback#L2047
RE:pylint and pylint3
Hello, > The Python 3 variant of the package should begin to provide the > /usr/bin/foo wrapper script interface when the Python 2 package is > dropped. > This really ought to be codified somewhere, and I'm not sure that the > DPMT wiki page is visible enough or will be consulted by maintainers > when dropping the Python 2 variant of their packages. Should I understand that this information about the wrapper is available on the DPMT wiki? Fred
pylint and pylint3
Hello, it seems that the pylint package does not provide pylint3 anymore (since 21h ;). But the spyder package still requires pylint3 and pylint when installing spyder or spyder3; this is why the next taurus will FTBFS. So is this expected and the spyder package should be fixed, or is it a bug in the pylint package? Cheers, Fred
RE:dropping python2 [was Re: scientific python stack transitions]
It would be nice if the python2 packages could be skipped in bullseye but not for backports in buster. Is that something which could be envisioned? Fred
dh_python2 namespace issue ?
Hello, while preparing one of our packages (pyfai), we ended up with an FTBFS due to backports.functools_lru_cache [1]. Here is the backtrace:

Traceback (most recent call last):
  File "./run_tests.py", line 543, in
    unittest.defaultTestLoader.loadTestsFromNames(options.test_name))
  File "/usr/lib/python2.7/unittest/loader.py", line 130, in loadTestsFromNames
    suites = [self.loadTestsFromName(name, module) for name in names]
  File "/usr/lib/python2.7/unittest/loader.py", line 115, in loadTestsFromName
    test = obj()
  File "/<>/pyfai-0.17.0+dfsg1/.pybuild/cpython2_2.7_pyfai/build/pyFAI/test/__init__.py", line 50, in suite
    from . import test_all
  File "/<>/pyfai-0.17.0+dfsg1/.pybuild/cpython2_2.7_pyfai/build/pyFAI/test/test_all.py", line 50, in
    from . import test_peak_picking
  File "/<>/pyfai-0.17.0+dfsg1/.pybuild/cpython2_2.7_pyfai/build/pyFAI/test/test_peak_picking.py", line 48, in
    from ..gui.peak_picker import PeakPicker
  File "/<>/pyfai-0.17.0+dfsg1/.pybuild/cpython2_2.7_pyfai/build/pyFAI/gui/peak_picker.py", line 59, in
    from .utils import update_fig, maximize_fig
  File "/<>/pyfai-0.17.0+dfsg1/.pybuild/cpython2_2.7_pyfai/build/pyFAI/gui/utils/__init__.py", line 41, in
    from .. import matplotlib
  File "/<>/pyfai-0.17.0+dfsg1/.pybuild/cpython2_2.7_pyfai/build/pyFAI/gui/matplotlib.py", line 48, in
    import matplotlib
  File "/usr/lib/python2.7/dist-packages/matplotlib/__init__.py", line 130, in
    from matplotlib.rcsetup import defaultParams, validate_backend, cycler
  File "/usr/lib/python2.7/dist-packages/matplotlib/rcsetup.py", line 29, in
    from matplotlib.fontconfig_pattern import parse_fontconfig_pattern
  File "/usr/lib/python2.7/dist-packages/matplotlib/fontconfig_pattern.py", line 28, in
    from backports.functools_lru_cache import lru_cache
ImportError: No module named backports.functools_lru_cache

So we tried, in a clean chroot, a simple import of backports.functools_lru_cache, which ends up with:

~# python -c "import backports.functools_lru_cache; print backports.functools_lru_cache"
Traceback (most recent call last):
  File "", line 1, in
ImportError: No module named backports.functools_lru_cache

If we add an __init__.py file:

(unstable-amd64-sbuild)root@diffabs6:/usr/lib/python2.7/dist-packages/backports# touch __init__.py

then it works:

# python -c "import backports.functools_lru_cache; print backports.functools_lru_cache"

So the question is: who is in charge of this __init__.py file? If I look in the rules [2] file of the backports.functools... package, we can see this:

override_dh_auto_install:
	dh_auto_install
	rm -rf $(CURDIR)/debian/*/usr/lib/python2.7/dist-packages/backports/__init__.py

override_dh_python2:
	dh_python2 --namespace backports

The changelog [3][4] is even more explicit, explaining that dh_python2 is now in charge of the backports namespace. Is there a bug in the dh_python2 namespace machinery, or in the backports.func... package? Thanks for your attention.
Frédéric [1] https://tracker.debian.org/pkg/backports.functools-lru-cache [2] https://tracker.debian.org/media/packages/b/backports.functools-lru-cache/rules-1.5-1 [3] https://tracker.debian.org/media/packages/b/backports.functools-lru-cache/changelog-1.5-1 [4] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=884690
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
Hello, here is a diff between the python3.6 and python3.7 modules once updated via 2to3:

r:/tmp$ diff core*
174c174
< for (variance, tag) in zip(variances, tags))
---
> for (variance, tag) in list(zip(variances, tags)))
181c181
< for (coords, value) in zip(transform, nom_values))
---
> for (coords, value) in list(zip(transform, nom_values)))
946c946
< ord(sup): normal for (normal, sup) in TO_SUPERSCRIPT.items()}
---
> ord(sup): normal for (normal, sup) in iter(TO_SUPERSCRIPT.items())}
1776c1776
< delta**2 for delta in self.error_components().values(
---
> delta**2 for delta in iter(self.error_components().values()

Can a Python expert tell me whether this looks correct or not? Thanks for your help. Frederic
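For the zip() hunks at least, the two variants above are behaviourally equivalent in Python 3: zip() returns a lazy iterator, and a comprehension consumes it the same way whether or not it is first wrapped in list() (the wrapping just builds an intermediate list). A small sketch with made-up data:

```python
# In Python 3, zip() is lazy; wrapping it in list() -- as 2to3
# sometimes does defensively -- does not change the result of a
# comprehension that iterates over it.
variances = [0.1, 0.2]
tags = ["a", "b"]

without_list = {tag: v for (v, tag) in zip(variances, tags)}
with_list = {tag: v for (v, tag) in list(zip(variances, tags))}

assert without_list == with_list == {"a": 0.1, "b": 0.2}
print("both 2to3 variants give the same dict")
```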
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
Hello Andreas, > Patches are welcome (I have no idea what the construct is doing neither > how to replace it with something valid). > Patch welcome as well - preferably as commit to Git. Done. But now we need to understand, before uploading, why lintian complains about a python module in the wrong place. I am not sure that the tests run against the installed package... Fred
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
I think that the real problem with the current build is that the conf.py file changes sys.path; this is why we see this syntax error: sphinx picks the wrong path. I cannot work on this now... I am not in front of a Debian box, nor do I have access to one today... Cheers, Fred
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
I found in the code a string with a ur'' prefix; this is the problematic line. I do not know if this is a valid string construction. I also found that you need to remove the sys.path modifications from conf.py; they can cause trouble during the build. Cheers. Fred
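It is not valid: the combined ur'' prefix existed in Python 2 but was removed in Python 3 (u'' and r'' are each still accepted, just not together), so such a line is a SyntaxError under any Python 3 interpreter. This can be checked directly with compile():

```python
# u'' and r'' prefixes are valid on their own in Python 3 ...
compile("s = u'ok'", "<demo>", "exec")
compile("s = r'ok'", "<demo>", "exec")

# ... but the combined ur'' prefix was removed in Python 3 and is a
# plain SyntaxError.
try:
    compile("s = ur'boom'", "<demo>", "exec")
except SyntaxError:
    print("ur'' is a SyntaxError in Python 3")
```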
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
I found the culprit: the conf.py file of the documentation prepends ".." to sys.path before importing the module. This is why it uses the wrong version of the built module. Now during the build I get this:

D: dh_python3 dh_python3:164: args: []
D: dh_python3 dh_python3:166: supported Python versions: 3.6,3.7 (default=3.6)
D: dh_python3 debhelper:107: skipping package: python-uncertainties
D: dh_python3 debhelper:153: source=uncertainties, binary packages=['python3-uncertainties']
D: dh_python3 dh_python3:183: processing package python3-uncertainties...
D: dh_python3 fs:49: moving files from debian/python3-uncertainties/usr/lib/python3.6/dist-packages to debian/python3-uncertainties/usr/lib/python3/dist-packages/
D: dh_python3 fs:49: moving files from debian/python3-uncertainties/usr/lib/python3.7/dist-packages to debian/python3-uncertainties/usr/lib/python3/dist-packages/
W: dh_python3 fs:98: Paths differ: debian/python3-uncertainties/usr/lib/python3.7/dist-packages/uncertainties/core.py and debian/python3-uncertainties/usr/lib/python3/dist-packages/uncertainties/core.py
W: dh_python3 fs:98: Paths differ: debian/python3-uncertainties/usr/lib/python3.7/dist-packages/uncertainties/lib1to2/test_1to2.py and debian/python3-uncertainties/usr/lib/python3/dist-packages/uncertainties/lib1to2/test_1to2.py
W: dh_python3 fs:98: Paths differ: debian/python3-uncertainties/usr/lib/python3.7/dist-packages/uncertainties/umath_core.py and debian/python3-uncertainties/usr/lib/python3/dist-packages/uncertainties/umath_core.py
D: dh_python3 fs:246: package python3-uncertainties details = {'requires.txt': {'debian/python3-uncertainties/usr/lib/python3/dist-packages/uncertainties-3.0.2.egg-info/requires.txt'}, 'egg-info': set(), 'nsp.txt': set(), 'shebangs': set(), 'public_vers': {Version('3'), Version('3.7')}, 'private_dirs': {}, 'compile': True, 'ext_vers': set(), 'ext_no_version': set()}

I am worried by the dh_python3 warnings. It seems that the 2to3 files generated by python3.6 and python3.7 differ. Am I right? And if so, what is the right way to solve this issue? Thanks, Frederic
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
Now the test part :))

Correlated variables. ... ok
Tests the input of correlated value. ... ok
==
ERROR: Failure: ImportError (No module named tests.support)
--
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/nose/loader.py", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "/usr/lib/python2.7/dist-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/usr/lib/python2.7/dist-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/<>/.pybuild/cpython2_2.7_uncertainties/build/uncertainties/lib1to2/test_1to2.py", line 42, in
    import lib2to3.tests.support as support
ImportError: No module named tests.support

Do you know which package provides this tests.support module of lib2to3? Thanks, Frederic
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
> try adding python3-setuptools to Build-Depends

OK, I removed all the black magic from debian/rules and added setuptools :). So now I get this error when building the documentation:

PYTHONPATH=`pybuild --print build_dir --interpreter python3` http_proxy='http://127.0.0.1:9/' sphinx-build -N -bhtml doc debian/html
Running Sphinx v1.7.9

Configuration error:
There is a programable error in your configuration file:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/sphinx/config.py", line 161, in __init__
    execfile_(filename, config)
  File "/usr/lib/python3/dist-packages/sphinx/util/pycompat.py", line 150, in execfile_
    exec_(code, _globals)
  File "conf.py", line 19, in
    import uncertainties
  File "/<>/uncertainties/__init__.py", line 224, in
    from .core import *
  File "/<>/uncertainties/core.py", line 946, in
    ord(sup): normal for (normal, sup) in TO_SUPERSCRIPT.iteritems()}
AttributeError: 'dict' object has no attribute 'iteritems'

As we can see, even though we give it the PYTHONPATH, sphinx-build loads the module from the source directory (File "/<>/uncertainties/core.py", line 946) and not from /<>/.pybuild/cpython3_3.6_uncertainties/build/, which should contain the module updated with 2to3. What is the proper way to build sphinx documentation during a build? Cheers, Frederic
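The AttributeError itself is the classic Python 2/3 split: dict.iteritems() existed only in Python 2, and in Python 3 the direct replacement is dict.items(), which already returns a lazy view. That is exactly the rewrite 2to3 performs on the line quoted above, so hitting iteritems() here proves the un-converted source tree is being imported. A sketch with a tiny stand-in table:

```python
# A stand-in for the TO_SUPERSCRIPT table from uncertainties/core.py
# (the real table is larger; these two entries are made up here).
TO_SUPERSCRIPT = {"0": "\u2070", "1": "\u00b9"}

# Python 3 dicts have no iteritems() -- that is the AttributeError.
assert not hasattr(TO_SUPERSCRIPT, "iteritems")

# The 2to3 rewrite: items() returns a view and works directly.
FROM_SUPERSCRIPT = {ord(sup): normal for (normal, sup) in TO_SUPERSCRIPT.items()}
assert FROM_SUPERSCRIPT[ord("\u00b9")] == "1"
print("items() is the Python 3 replacement")
```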
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
You are right, I had not noticed that setuptools was not part of the build dependencies...
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
Hello Andreas, it seems to me that the problem is due to the 2to3 conversion. I looked at the first failure when you re-activated the unit tests [1]. In my opinion, the code is modified in place by 2to3, so the code in the source tree after the configure step is already converted to python3. Then, during the build process: with python2, the code is copied as-is into the .pybuild place for python2; with python3, we see that the RefactoringTool does nothing, meaning the code is already converted to python3. So I guess that the Python code used for python2 is already the code modified for python3... This is why the unit tests fail. [1] https://salsa.debian.org/debian/python-uncertainties/-/jobs/53314
RE:TypeError: ord() expected a character, but string of length 3 found (Was: Updated python-uncertainties)
Hello Andreas, during the tests, does it load the modules from the source files, or does it use the ones under the build directory? Maybe there is a mismatch between the python2 code and the 2to3 code targeting python3. Did that help? Fred
cffi issue ? was RE:python3.7-dbg issue ?
I think that there is a problem with cffi: pyopencl was built with python3-cffi-backend i386 1.11.5-1 [80.2 kB], but the backend used for the tests is the current 1.11.5-3. Here is the Debian changelog:

python-cffi (1.11.5-3) unstable; urgency=medium

  [ Ondřej Nový ]
  * Use 'python3 -m sphinx' instead of sphinx-build for building docs

  [ Stefano Rivera ]
  * Patch skip-double-float-int: Skip big double->float inf tests on ppc64el, they are known to fail on gcc8, and the blame is being debated upstream.
  * Patch skip-init-locking-hurd: Skip a non-critical test that fails occasionally on Hurd. (See: #893743)
  * Bump Standards-Version to 4.2.1, no changes needed.

 -- Stefano Rivera  Wed, 12 Sep 2018 17:05:03 +0300

python-cffi (1.11.5-2) unstable; urgency=medium

  [ Ondřej Nový ]
  * d/tests: Use AUTOPKGTEST_TMP instead of ADTTMP
  * d/control: Remove ancient X-Python-Version field
  * d/control: Remove ancient X-Python3-Version field

  [ Stefano Rivera ]
  * Add python-cffi-doc package with Sphinx docs (Closes: #891865)
  * Patch: x32-pointers: handle pointers on ILP32 ABIs correctly. (Closes: #884705)
  * Bump Standards-Version to 4.2.0, no changes needed.
RE:python3.7-dbg issue ?
Hello, I rebuilt pyopencl, and the problem vanished. So what should I do now? Ask for a binNMU, or try to understand what is going on? Thanks for your time, Fred

picca@mordor:/tmp$ python3.7-dbg -c "import pyopencl"
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:943: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  collections.MutableMapping.register(ParseResults)
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:3245: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  elif isinstance( exprs, collections.Iterable ):
/usr/lib/python3.7/importlib/_bootstrap_external.py:434: ImportWarning: Not importing directory /usr/lib/python3/dist-packages/logilab: missing __init__
  _warnings.warn(msg.format(portions[0]), ImportWarning)
RE:python3.7-dbg issue ?
Ok, one more step, and this time I really need your advice :))

$ python3.7-dbg -c "import pyopencl"
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:943: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  collections.MutableMapping.register(ParseResults)
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:3245: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  elif isinstance( exprs, collections.Iterable ):
/usr/lib/python3.7/importlib/_bootstrap_external.py:434: ImportWarning: Not importing directory /usr/lib/python3/dist-packages/logilab: missing __init__
  _warnings.warn(msg.format(portions[0]), ImportWarning)
* ob
object : type: tuple refcount: 0 address : 0xb5f507a4
* op->_ob_prev->_ob_next
object :
Segmentation fault

picca@mordor:~$ python3.7 -c "import pyopencl"

Maybe the problem is in pyopencl. But I find it really strange that I get the warning messages with python3.7-dbg but not with python3.7. Is that normal?
RE:python3.7-dbg issue ?
Ok, I could simplify the problem to a single import:

picca@mordor:~$ python3.7-dbg
Python 3.7.0+ (default, Aug 31 2018, 23:21:37) [GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import silx.opencl
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:943: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  collections.MutableMapping.register(ParseResults)
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:3245: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  elif isinstance( exprs, collections.Iterable ):
/usr/lib/python3.7/importlib/_bootstrap_external.py:434: ImportWarning: Not importing directory /usr/lib/python3/dist-packages/logilab: missing __init__
  _warnings.warn(msg.format(portions[0]), ImportWarning)
>>> exit
Use exit() or Ctrl-D (i.e. EOF) to exit
>>>
* ob
object : type: tuple refcount: 0 address : 0xb5c5a68c
* op->_ob_prev->_ob_next
object :
Segmentation fault
python3.7-dbg issue ?
Hello, I am trying to understand this [1] test failure with python3.7-dbg. So I ran it on my unstable box and got this error from within gdb:

testTrainingSpectrumReading (specfilewrapperTest.testSpecfilewrapper) ... ok
--
Ran 43 tests in 489.199s

OK
* ob
object : type: tuple refcount: 0 address : 0xab326d8c
* op->_ob_prev->_ob_next
object :

Thread 1 "python3.7-dbg" received signal SIGSEGV, Segmentation fault.
0x080a8f1d in PyObject_Repr (v=) at ../Objects/object.c:457
457 ../Objects/object.c: No such file or directory.
(gdb) bt
#0 0x080a8f1d in PyObject_Repr (v=) at ../Objects/object.c:457
#1 0x080a939a in PyObject_Print (op=, fp=0xb7e0dce0 <_IO_2_1_stderr_>, flags=0) at ../Objects/object.c:369
#2 0x080a94dd in _PyObject_Dump (op=) at ../Objects/object.c:427
#3 0x080a95af in _Py_ForgetReference (op=Python Exception Type does not have a target.: ) at ../Objects/object.c:1909
#4 0x080a8ea8 in _Py_Dealloc (Python Exception Type does not have a target.: op=) at ../Objects/object.c:1932
#5 0x080980e4 in free_keys_object (keys=keys@entry=0x91d0280) at ../Objects/dictobject.c:556
#6 0x08099ad6 in dict_dealloc (mp=0xb7a973f4) at ../Objects/dictobject.c:1911
#7 0x080a8ead in _Py_Dealloc (Python Exception Type does not have a target.: op=) at ../Objects/object.c:1933
#8 0x0813e444 in _PyImport_Fini () at ../Python/import.c:283
#9 0x0814c121 in Py_FinalizeEx () at ../Python/pylifecycle.c:1211
#10 0x0814c3ec in Py_Exit (sts=0) at ../Python/pylifecycle.c:2238
#11 0x08154661 in handle_system_exit () at ../Python/pythonrun.c:636
#12 0x08155d13 in PyErr_PrintEx (set_sys_last_vars=1) at ../Python/pythonrun.c:646
#13 0x08156083 in PyErr_Print () at ../Python/pythonrun.c:542
#14 0x08067583 in pymain_run_module (modname=, set_argv0=set_argv0@entry=1) at ../Modules/main.c:322
#15 0x08067ca0 in pymain_run_python (pymain=pymain@entry=0xb3a0) at ../Modules/main.c:2630
#16 0x0806934f in pymain_main (pymain=pymain@entry=0xb3a0) at ../Modules/main.c:2782
#17 0x080694c2 in _Py_UnixMain (argc=3, argv=0xb534) at ../Modules/main.c:2817
#18 0x08064ecd in main (argc=3, argv=0xb534) at ../Programs/python.c:15

I personally do not have the knowledge to understand this issue, so I would like your help to find the culprit here. Thanks, Frederic

[1] https://salsa.debian.org/science-team/pymca/-/jobs/45873/raw
pybuild and pytest.
Hello, I am trying to upgrade spyder and to activate the unit tests during the build. But when pybuild runs the tests, it ends with this error message:

dh_auto_test: pybuild --test --test-pytest -i python{version} -p "3.7 3.6" returned exit code 13

with no other information. So my question is: how can I investigate this failure if I do not have any information about it? Thanks for your help. Frederic

dh_auto_test -O--buildsystem=pybuild
pybuild --test -i python{version} -p 2.7
pybuild --test --test-pytest -i python{version} -p "3.7 3.6"
I: pybuild base:217: cd /<>/spyder-3.3.0+dfsg1/.pybuild/cpython3_3.7_spyder/build; python3.7 -m pytest
= test session starts ==
platform linux -- Python 3.7.0+, pytest-3.6.4, py-1.5.4, pluggy-0.6.0
PyQt5 5.11.2 -- Qt runtime 5.11.1 -- Qt compiled 5.11.1
rootdir: /<>/spyder-3.3.0+dfsg1, inifile:
plugins: qt-2.3.1, flaky-3.3.0
E: pybuild pybuild:338: test: plugin distutils failed with: exit code=1: cd /<>/spyder-3.3.0+dfsg1/.pybuild/cpython3_3.7_spyder/build; python3.7 -m pytest
dh_auto_test: pybuild --test --test-pytest -i python{version} -p "3.7 3.6" returned exit code 13
make: *** [debian/rules:18: build] Error 25
RE:Is there a tool to debianize for the first time a package from pypi
> remove control file and invoke py2dsp - it will regenerate it > That said, you probably want dch (debchange) rather than a new control > file Thanks a lot. It would be nice to have an equivalent of dgit-main-xxx for maintaining python packages, maybe in the policy? This way people, newcomers etc. would use the right workflow to maintain python packages. My 2 cents. Cheers, Fred
RE:Is there a tool to debianize for the first time a package from pypi
Hello, once debianized, is there a command which allows updating the control file for a new upstream version, in order to take into account the new python dependencies? It would simplify the maintenance of python packages a lot. A "py2dsp update", like the cme command? Cheers, Frederic
RE:Is there a tool to debianize for the first time a package from pypi
No problem Ghislain, I am on it :)) Cheers, Fred
Is there a tool to debianize for the first time a package from pypi
Hello, I need to create a new package for spyder_kernels. Is there a tool which creates the first version of a package, i.e. the debian/ directory, from the setup.py file? Thanks, Frederic
RE:how to deal with py-file-not-bytecompiled
> A bit off-topic, but you should not use Qt 4 in new packages. > See https://lists.debian.org/debian-devel-announce/2017/08/msg6.html. I did the migration to qtcreator :)) Do not worry. Fred
RE:how to deal with py-file-not-bytecompiled
thanks a lot
how to deal with py-file-not-bytecompiled
Hello, I am preparing the new silx package and I got these error messages from adequate. This package produces the silx python module but also installs a bunch of files for qtdesigner, via this command in the rules file:

# install the qtdesigner files only for the python3 package
dh_install -p python3-silx qtdesigner_plugins/*.py /usr/lib/qt4/plugins/designer/python

So I would like to know what the proper way to deal with this is. Thanks for your help. Frederic

1m30.3s DEBUG: Starting command: ['adequate', '--root', '/var/run/schroot/mount/unstable-amd64-sbuild-0a8521f2-961d-11e8-8726-a0369f838f54-piuparts', 'python-silx-dbg', 'python-silx-dbgsym', 'python-silx-doc', 'python-silx', 'python3-silx-dbg', 'python3-silx-dbgsym', 'python3-silx', 'silx']
1m30.6s DUMP:
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plot1dplugin.py
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plot2dplugin.py
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plotwidgetplugin.py
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plotwindowplugin.py
1m30.6s DEBUG: Command ok: ['adequate', '--root', '/var/run/schroot/mount/unstable-amd64-sbuild-0a8521f2-961d-11e8-8726-a0369f838f54-piuparts', 'python-silx-dbg', 'python-silx-dbgsym', 'python-silx-doc', 'python-silx', 'python3-silx-dbg', 'python3-silx-dbgsym', 'python3-silx', 'silx']
1m30.6s ERROR: FAIL: Inadequate results from running adequate!
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plot1dplugin.py
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plot2dplugin.py
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plotwidgetplugin.py
python3-silx: py-file-not-bytecompiled /usr/lib/qt4/plugins/designer/python/plotwindowplugin.py
1m30.6s ERROR: FAIL: Running adequate resulted in inadequate tags found: py-file-not-bytecompiled
RE:packages that use dh_python{2,3} but don't depend on dh-python
What about teaching cme how to fix a package's Build-Depends? This way a simple "cme fix dpkg" would do the job?
RE:Python2 EOL and moving towards Python3
What about packages which contain, in their rules files, a direct call to dh_numpy2? Cheers, Frederic
RE:providing sphinx3-* binaries
> Or just use the sphinx-generated Makefile if there is one. Except that when there is autodoc in the documentation, I like to build the doc with all {interpreter}s; it is a sort of unit test. Cheers, Fred
RE:providing sphinx3-* binaries
Hello guys.

> override_dh_sphinxdoc:
> ifeq (,$(findstring nodocs, $(DEB_BUILD_OPTIONS))

nodocs or nodoc? I also do something like this when there are extensions:

override_dh_sphinxdoc:
ifeq (,$(findstring nodoc, $(DEB_BUILD_OPTIONS)))
	PYBUILD_SYSTEM=custom \
	PYBUILD_BUILD_ARGS="cd docs && PYTHONPATH={build_dir} http_proxy='127.0.0.1:9' {interpreter} -m sphinx -N -bhtml source build/html" dh_auto_build # HTML generator
	dh_installdocs "docs/build/html" -p python-gpyfft-doc
	dh_sphinxdoc -O--buildsystem=pybuild
endif

Cheers, Frederic
-dbg packages and dependencies
Hello, it seems that dependencies are not generated for the -dbg packages by dh_python[23]. Is there a way to ask dh_python to generate these dependencies from the Build-Depends of a package? Thanks, Frederic
RE:pydist and install_requires
Hello Ghislain > Indeed, you need to use the name registered on pypi, which can be > different from the Debian package name. For OpenGL, the project is > registered as PyOpenGL, for PyQt5 the name is PyQt5. Yes, it works :) > but for pyqt5 I do not have egg informations. Maybe the solution would be to > add egg info to pyqt5. > But for the split ??? > There are no individual registered names for the PyQt5 components, so > you can only provide PyQt5 in the setup.py metadata, and you must list > the necessary components yourself in d/control. Yes, in my case I will override pyqt5 in the pydist-overrides in order to add the right dependencies. Cheers, Fred
RE:pydist and install_requires
Hello Andrey > Isn't just adding the package names to Depends easier? I just want this to be generated automatically and to be "upstreamable". > So do you have python-opengl, python-pyqt5 etc in Build-Depends? Yes. I think that I found the problem for opengl: the egg-info gives the name of the project, which is PyOpenGL (I am testing this now). But for pyqt5 I do not have egg information. Maybe the solution would be to add egg-info to pyqt5. But what about the split??? Cheers, Frederic
pydist and install_requires
Hello, still working on my silx package... When it comes to python:Depends, I try to add the right entries in setup.py, so I added a bunch of modules there:

-"setuptools"]
+"setuptools",
+# Debian added
+"ipython",
+"qtconsole",
+"enum34",
+"fabio",
+"h5py",
+"lxml",
+"mako",
+"matplotlib",
+"OpenGL",
+"pil",
+"pyopencl",
+"PyQt5",
+"PyQt5.qtopengl",
+"PyQt5.qtsvg",
+"scipy"
+]

But when compiling I get these messages from dh_python2 and dh_python3:

I: dh_python2 pydist:220: Cannot find package that provides opengl. Please add package that provides it to Build-Depends or add "opengl python-opengl" line to debian/pydist-overrides or add proper dependency to Depends by hand and ignore this info.
I: dh_python2 pydist:220: Cannot find package that provides pyqt5. Please add package that provides it to Build-Depends or add "pyqt5 python-pyqt5" line to debian/pydist-overrides or add proper dependency to Depends by hand and ignore this info.
I: dh_python2 pydist:220: Cannot find package that provides pyqt5.qtopengl. Please add package that provides it to Build-Depends or add "pyqt5.qtopengl python-pyqt5.qtopengl" line to debian/pydist-overrides or add proper dependency to Depends by hand and ignore this info.
I: dh_python2 pydist:220: Cannot find package that provides pyqt5.qtsvg. Please add package that provides it to Build-Depends or add "pyqt5.qtsvg python-pyqt5.qtsvg" line to debian/pydist-overrides or add proper dependency to Depends by hand and ignore this info.

So my question is: why do I get OpenGL -> opengl in the dh_python2 message? (Is that why it cannot find the right package?) For pyqt5 it seems more complicated, since the Debian packages split the PyQt5 namespace. What should I put there in order to get the right auto-generated dependencies? Thanks for your help, Frederic
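On the OpenGL -> opengl question: distribution names are compared case-insensitively (the exact dh_python normalization rule is an assumption here; only the lowercasing is visible in the messages above), so "OpenGL" from setup.py is folded to "opengl" before the pydist lookup. The lowercasing itself is trivial to see:

```python
# Sketch (assumption: dh_python lowercases requirement names before
# the pydist lookup, which matches the messages it prints).
requires = ["OpenGL", "PyQt5", "PyQt5.qtopengl", "PyQt5.qtsvg"]
normalized = [name.lower() for name in requires]
assert normalized == ["opengl", "pyqt5", "pyqt5.qtopengl", "pyqt5.qtsvg"]
print(normalized)
```

The lowercased form is what a debian/pydist-overrides line has to match, which is why the messages suggest entries like "opengl python-opengl".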
Disabling test does not work, is it a pybuild bug ?
Hello,

before reporting a bug against dh-python, I would like your opinion. I try to skip the tests for all the debug versions of the interpreter, so I added this to the rules file:

export PYBUILD_DISABLE_python2-dbg=test
export PYBUILD_DISABLE_python3-dbg=test

and here is my test target:

# WITH_QT_TEST=False to disable graphical tests
# SILX_OPENCL=False to disable OpenCL tests
# SILX_TEST_LOW_MEM=True to disable tests taking large amount of memory
# GPU=False to disable the use of a GPU with OpenCL test
# WITH_GL_TEST=False to disable tests using OpenGL
override_dh_auto_test:
	mkdir -p $(POCL_CACHE_DIR)  # create POCL cache dir in order to avoid an FTBFS in sbuild
	dh_auto_test -- -s custom --test-args="env PYTHONPATH={build_dir} GPU=False SILX_OPENCL=False SILX_TEST_LAW_MEM=True xvfb-run -a --server-args=\"-screen 0 1024x768x24\" {interpreter} run_tests.py -v"

It works on jessie and stretch, but on unstable I get this:

I: pybuild base:184: env PYTHONPATH=/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build GPU=False SILX_OPENCL=False SILX_TEST_LAW_MEM=True xvfb-run -a --server-args="-screen 0 1024x768x24" python2.7-dbg run_tests.py -v
WARNING:run_tests:h5py missing: /usr/lib/python2.7/dist-packages/h5py/_errors.i386-linux-gnu.so: undefined symbol: Py_InitModule4
INFO:silx.setup:Use setuptools
INFO:silx.setup:Use setuptools.setup
[201131 refs]
WARNING:run_tests:Test silx 0.5.0 from /<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx
/usr/lib/python2.7/dist-packages/subprocess32.py:472: RuntimeWarning: The _posixsubprocess module is not being used. Child process reliability may suffer if your program uses threads.
  "program uses threads.", RuntimeWarning)
/usr/lib/python2.7/dist-packages/IPython/qt.py:13: ShimWarning: The `IPython.qt` package has been deprecated. You should import from qtconsole instead.
  "You should import from qtconsole instead.", ShimWarning)
ERROR:silx.io.octaveh5:Module silx.io.octaveh5 requires h5py
WARNING:silx.opencl:Use of OpenCL has been disabled from environment variable: SILX_OPENCL=0
libEGL warning: DRI2: failed to open swrast (search paths /usr/lib/i386-linux-gnu/dri:${ORIGIN}/dri:/usr/lib/dri)
libEGL warning: DRI2: failed to open swrast (search paths /usr/lib/i386-linux-gnu/dri:${ORIGIN}/dri:/usr/lib/dri)
ERROR:silx.gui.hdf5._utils:Module silx.gui.hdf5._utils requires h5py
Traceback (most recent call last):
  File "run_tests.py", line 369, in <module>
    unittest.defaultTestLoader.loadTestsFromNames(options.test_name))
  File "/usr/lib/python2.7/unittest/loader.py", line 130, in loadTestsFromNames
    suites = [self.loadTestsFromName(name, module) for name in names]
  File "/usr/lib/python2.7/unittest/loader.py", line 115, in loadTestsFromName
    test = obj()
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/test/__init__.py", line 59, in suite
    test_suite.addTest(test_gui.suite())
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/gui/test/__init__.py", line 72, in suite
    from ..hdf5 import test as test_hdf5
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/gui/hdf5/__init__.py", line 38, in <module>
    from .Hdf5TreeView import Hdf5TreeView  # noqa
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/gui/hdf5/Hdf5TreeView.py", line 34, in <module>
    from .Hdf5TreeModel import Hdf5TreeModel
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/gui/hdf5/Hdf5TreeModel.py", line 36, in <module>
    from .Hdf5Item import Hdf5Item
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/gui/hdf5/Hdf5Item.py", line 36, in <module>
    from . import _utils
  File "/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build/silx/gui/hdf5/_utils.py", line 46, in <module>
    raise e
ImportError: /usr/lib/python2.7/dist-packages/h5py/_errors.i386-linux-gnu.so: undefined symbol: Py_InitModule4
[795788 refs]
E: pybuild pybuild:283: test: plugin custom failed with: exit code=1: env PYTHONPATH=/<>/silx-0.5.0+dfsg/.pybuild/pythonX.Y-dbg_2.7/build GPU=False SILX_OPENCL=False SILX_TEST_LAW_MEM=True xvfb-run -a --server-args="-screen 0 1024x768x24" python2.7-dbg run_tests.py -v
dh_auto_test: pybuild --test -i python{version}-dbg -p 2.7 -s custom "--test-args=env PYTHONPATH={build_dir} GPU=False SILX_OPENCL=False SILX_TEST_LAW_MEM=True xvfb-run -a --server-args=\"-screen 0 1024x768x24\" {interpreter} run_tests.py -v" returned exit code 13
debian/rules:54: recipe for target 'override_dh_auto_test' failed

It seems that pybuild does not honor the DISABLE variables. Did I find a bug?

thanks for your help

Frédéric
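In the meantime, two alternative spellings may be worth testing. Both are assumptions derived from the variable forms documented in pybuild(1), not a confirmed fix: the log shows dh_auto_test invoking pybuild with -i python{version}-dbg -p 2.7, i.e. the interpreter string is python2.7-dbg, so the disable key may need the fully versioned name:

```
# debian/rules -- hypothetical alternatives, untested
# combined form, with category/interpreter pairs:
export PYBUILD_DISABLE=test/python2.7-dbg

# or per interpreter, with the fully versioned name:
export PYBUILD_DISABLE_python2.7-dbg=test
```

If the versioned form works where python2-dbg does not, that would narrow down where pybuild's matching goes wrong.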
502 Bad Gateway with the PyPI redirector?
Hello,

when I try to update one of my packages, I get this error via uscan:

:~/Debian/lmfit-py/lmfit-py$ uscan
uscan warn: In watchfile debian/watch, reading webpage
  https://pypi.debian.net/lmfit/ failed: 502 Bad Gateway

Does someone know what is going on?

Cheers

Frederic
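For context, a debian/watch file using the pypi.debian.net redirector typically looks roughly like this (a sketch; the exact regex in lmfit-py's actual watch file may differ):

```
version=3
https://pypi.debian.net/lmfit/ .*/lmfit-(.+)\.tar\.gz
```

The 502 comes from the redirector service itself rather than from the watch file, so when pypi.debian.net is down or overloaded, every package whose watch file points at it will fail the same way.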
Re: building manpages via setup.py
Ok, if I replace '{interpreter} setup.py build_man' by 'env; setup.py build_man', I get this:

HOME=/home/picca
PYTHONPATH=/home/picca/Debian/silx/silx/.pybuild/pythonX.Y_2.7/build

But if I read the code of create_pydistutils_cfg:

def create_pydistutils_cfg(func):
    """distutils doesn't have sane command-line API - this decorator creates
    .pydistutils.cfg file to workaround it

    hint: if you think this is plain stupid, please don't read
    distutils/setuptools/distribute sources
    """
    def wrapped_func(self, context, args, *oargs, **kwargs):
        fpath = join(args['home_dir'], '.pydistutils.cfg')
        if not exists(fpath):
            with open(fpath, 'w', encoding='utf-8') as fp:
                lines = ['[clean]\n',
                         'all=1\n',
                         '[build]\n',
                         'build-lib={}\n'.format(args['build_dir']),
                         '[install]\n',
                         'force=1\n',
                         'install-layout=deb\n',
                         'install-scripts=/usr/bin\n',
                         'install-lib={}\n'.format(args['install_dir']),
                         '[easy_install]\n',
                         'allow_hosts=None\n']
                log.debug('pydistutils config file:\n%s', ''.join(lines))
                fp.writelines(lines)
        context['ENV']['HOME'] = args['home_dir']
        return func(self, context, args, *oargs, **kwargs)
    return wrapped_func

we can see that HOME should be set to home_dir, i.e. HOME should be /home/picca/Debian/silx/silx/.pybuild/pythonX.Y_2.7.

I think that there is a problem here. Right?
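The mechanism at stake: distutils locates the per-user config file relative to $HOME (via os.path.expanduser), so whichever HOME the command runs with decides whether the generated .pydistutils.cfg is seen at all. A minimal sketch of that lookup, using only the stdlib (not the dh-python code):

```python
import os
import posixpath

def pydistutils_cfg_path(home):
    """Where distutils would look for the per-user config file,
    given a value of $HOME (sketch of the lookup mechanism only)."""
    os.environ['HOME'] = home
    return posixpath.expanduser('~/.pydistutils.cfg')

# With pybuild's home_dir, the generated config would be found:
print(pydistutils_cfg_path('/home/picca/Debian/silx/silx/.pybuild/pythonX.Y_2.7'))
# -> /home/picca/Debian/silx/silx/.pybuild/pythonX.Y_2.7/.pydistutils.cfg

# With the real $HOME seen in the env output above, it is not:
print(pydistutils_cfg_path('/home/picca'))
# -> /home/picca/.pydistutils.cfg
```

This is consistent with the observation: the env output showing HOME=/home/picca means the hook never sees the config that create_pydistutils_cfg wrote into home_dir.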
Re: building manpages via setup.py
So I am still investigating. With some debug output, I can see this: the first build part gives

dh_auto_build -- --after-build '{interpreter} setup.py build_man'
	pybuild --build -i python{version} -p 2.7 --after-build "{interpreter} setup.py build_man"
D: pybuild pybuild:474: version: 2.20170125
D: pybuild pybuild:475: ['/usr/bin/pybuild', '--build', '-i', 'python{version}', '-p', '2.7', '--after-build', '{interpreter} setup.py build_man']
D: pybuild pybuild:36: cfg: Namespace(after_build='{interpreter} setup.py build_man', after_clean=None, after_configure=None, after_install=None, after_test=None, before_build=None, before_clean=None, before_configure=None, before_install=None, before_test=None, build_args=None, build_only=True, clean_args=None, clean_only=False, configure_args=None, configure_only=False, custom_tests=False, destdir='debian/tmp', detect_only=False, dir='/home/picca/Debian/silx/silx', disable=None, ext_destdir=None, ext_pattern='\\.so(\\.[^/]*)?$', install_args=None, install_dir=None, install_only=False, interpreter=['python{version}'], list_systems=False, name='silx', quiet=False, really_quiet=False, system=None, test_args=None, test_nose=False, test_only=False, test_pytest=False, test_tox=False, verbose=True, versions=['2.7'])
D: pybuild pybuild:103: detected build system: distutils (certainty: 61%)
I: pybuild base:184: /usr/bin/python setup.py build
D: pybuild tools:217: invoking: /usr/bin/python setup.py build
INFO:silx.setup:Use setuptools
running build
running build_py
creating /home/picca/Debian/silx/silx/.pybuild/pythonX.Y_2.7/build/silx   <- right path

The after part then executes build_man, BUT:

I: pybuild pybuild:242: python2.7 setup.py build_man
D: pybuild tools:217: invoking: python2.7 setup.py build_man
INFO:silx.setup:Use setuptools
running build_man
running build_py
creating build/lib.linux-i386-2.7
creating build/lib.linux-i386-2.7/silx   <- ko

So the after-build part does not seem to take the .pydistutils.cfg file into account???
Is this normal?
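Assuming the diagnosis is right (the after-build hook runs without the HOME pointing at pybuild's home_dir, so distutils falls back to its default build/lib.* directory), one workaround to try is to pass the build directory explicitly on the command line instead of relying on .pydistutils.cfg. The {build_dir} substitution is the same one pybuild already expands in --test-args; whether build_man picks up the build command's --build-lib depends on how that custom command is written, so treat this strictly as a sketch:

```
# debian/rules -- hypothetical workaround, untested
override_dh_auto_build:
	dh_auto_build -- --after-build \
	    "{interpreter} setup.py build --build-lib {build_dir} build_man"
```

Chaining `build --build-lib ... build_man` uses distutils' ability to take per-command options on a multi-command invocation, which sidesteps the config file entirely.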