Hello, I am working on the silx package and the upstream install_requires is
sort of wrong.
It depends on the hdf5plugin, which is not necessary on Debian.
The purpose of this hdf5plugin is to register an HDF5 plugin when it is
loaded.
The code of the application uses a try/except in order
> Hello,
> I’d suggest you build it from source (python.org/ftp... with the needed
> version) as an additional python version, and then create your venv using the
> 3.6.
> You can dm me if you might need more details.
It would be great to have a python-builder package which generates a
Hello, I am working on the lmfit-py package
lintian complains about this:
https://salsa.debian.org/science-team/lmfit-py/-/jobs/909498
I use sphinx, so my question is: do you know how to fix this issue?
Lots of packages are affected by this.
What about helping
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=946035
python-language-server
is the only one missing now.
cheers
> If you want to embed python in an application, you need to use
> python3-embed.pc
> Or python3-config --embed
then it links the program with -lpython3.8
So what is the purpose of python3.pc?
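My current guess (please correct me if this sketch is wrong) is that python3.pc is meant for building extension modules, which must not link against libpython, while python3-embed.pc is for programs that embed the interpreter:

$ pkg-config --cflags --libs python3        # extension modules: headers only, no -lpython3.8
$ pkg-config --cflags --libs python3-embed  # embedding: adds -lpython3.8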
thanks
Fred
Hello, I am writing a program in Haskell which uses Python3.
It fails with:
Linking dist/build/binoculars/binoculars ...
src/Hkl/Python/Pyfi.hs:124: error: undefined reference to 'Py_BuildValue'
src/Hkl/Python/Pyfi.hs:199: error: undefined reference to 'PyUnicode_AsUTF8'
I found this
--sourcedirectory=src
Is it equivalent to -D?
A subsidiary question: is it possible to run a command before all the dh_auto_xxx
steps without overriding everything?
I need to run a command which generates the setup.py file, so I need to do
override_dh_auto_:
do_something
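Something like this is what I have in mind (only a sketch; "generate-setup" stands for whatever command really writes the setup.py, and the execute_before_* hook targets need a recent debhelper):

%:
	dh $@ --with python3 --buildsystem=pybuild

# with debhelper >= 13.1 a hook target avoids overriding anything:
execute_before_dh_auto_configure:
	./generate-setup

# on older debhelper, override only the first dh_auto_* step instead:
#override_dh_auto_configure:
#	./generate-setup
#	dh_auto_configure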
Hello, I have a package where the setup.py is not located at the root of the
source directory.
So I need to do
override_dh_auto_XXX:
dh_auto_XXX -- -d
Is there an export or something which allows telling pybuild where the setup.py to
deal with is?
thanks
Frederic
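(For now the closest thing I found is pointing dh at the subdirectory; a sketch, with "src" standing for the real location of the setup.py:)

%:
	dh $@ --with python3 --buildsystem=pybuild --sourcedirectory=src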
> You should consider /usr/lib// if you want to make your
> package multiarch-safe.
And what about
/usr/lib//
? Which one is better?
> > The issue is that the current build system does not provide an rpath for
> > these libraries so I can not add one via chrpath.
> Well, ideally you need to fix the build system so that it sets the correct
> rpath directly.
I found patchelf which allows adding an rpath :))
So I just need to set
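Something along these lines, I suppose (a rough sketch, untested; the package name, library directory and .so path below are only placeholders):

include /usr/share/dpkg/architecture.mk

override_dh_auto_install:
	dh_auto_install
	# give the extension an rpath pointing at the private library directory
	patchelf --set-rpath /usr/lib/$(DEB_HOST_MULTIARCH)/foo \
		debian/python3-foo/usr/lib/python3/dist-packages/foo/_ext.so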
> Lintian-brush is a fine tool, but (correct me if I am wrong) it would
> generate a patch excluding badges, and patches require maintenance.
You are right
maybe we should have a dh_privacy helper for this purpose.
cheers
Fred
What about lintian-brush?
Hello, I am working on the vitables package.
during the build I get this error message [1]
= test session starts ==
1567 platform linux -- Python 3.7.6, pytest-4.6.9, py-1.8.1, pluggy-0.13.0
1568 rootdir:
Indeed,
sorry for the noise.
Frederic
Hello,
I am packaging a Python application, so I decided to put the module under the
private directory
/usr/share/
But this software contains a Cython extension.
So in the end I get a lintian error due to a binary file under /usr/share.
What is the best solution when we need to package a
Hello
On unstable it works, but on buster (and we need to make it work for buster)
we get this message:
Python 3.7.3 (default, Apr 3 2019, 05:39:12)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pkg_resources
>>>
Hello,
We are working on the next pyfai package.
This new version uses entry_points like this:
gui_requires = ['PyQt5', 'h5py', 'hdf5plugin', 'PyOpenGL']
opencl_requires = ['pyopencl']
extras_require = {
    'calib2': gui_requires,  # Keep compatibility
    'gui': gui_requires,
Hello,
And what if, in the end, upstream could take care of the Debian packaging, by
adding a
.salsa-ci.yml in the upstream repository, in order to have feedback with nice
badges?
Cheers
Hello, in one of my packages (pymca), there is a syntax error like this.
byte-compiling
/builds/science-team/pymca/debian/output/pymca-5.5.2+dfsg/debian/python-pymca5/usr/lib/python2.7/dist-packages/PyMca5/Object3D/Object3DPlugins/ChimeraStack.py
to ChimeraStack.pyc
File
Hello, I would like to know if there is an equivalent of cabal-debian[1] ,
which helps a lot maintaining haskell packages.
It allows doing the initial packaging and also upgrading the packaging, by
updating the build dependencies, etc.
Cheers
Frederic
[1]
Hello sandro,
I can not find python-pyqtgraph in your list.
It seems to me that this package has reverse dependencies, but the python2
binaries were removed... but this is another problem.
Cheers
Frederic
Hello Sandro,
> I've just submitted
> https://salsa.debian.org/python-team/tools/dh-python/merge_requests/9
> to address this; not sure how quickly it will get merged & released
Thanks a lot, but what about backports? On backports we still need this mapping.
Cheers
Hello,
I am preparing the new spyder package.
Since the removal of pylint3 from src:pylint,
I need to remove the Build-Depends on pylint3.
Now dh_python3 still produces a pylint3 dependency for the binary packages.
It seems that this is hard-coded here [0].
Is it a bug of dh-python ?
Cheers
Hello, it seems that the pylint package does not provide pylint3 anymore (since
21h ;)
But the spyder package still requires pylint3 and pylint when installing spyder
or spyder3.
This is why the next taurus will FTBFS.
So is it something expected and the spyder package should be fixed, or a bug
It would be nice if the python2 packages could be skipped in bullseye but not
for backports in buster.
Is it something which could be envisioned?
Fred
Hello,
while preparing one of our packages (pyfai), we ended up with an FTBFS due to
backports.functools_lru_cache [1].
Here is the backtrace:
Traceback (most recent call last):
File "./run_tests.py", line 543, in
unittest.defaultTestLoader.loadTestsFromNames(options.test_name))
File
Hello, here is a diff between the python3.6 and python3.7 modules once updated
via 2to3.
r:/tmp$ diff core*
174c174
< for (variance, tag) in zip(variances, tags))
---
> for (variance, tag) in list(zip(variances, tags)))
181c181
< for (coords, value) in
Hello Andreas,
> Patches are welcome (I have no idea what the construct is doing neither
> how to replace it with something valid).
> Patch welcome as well - preferably as commit to Git.
Done, but now we need to understand why lintian complained about a Python module
at the wrong place before.
I think that the real problem with the current build is that the conf.py file
changes sys.path.
This is why we see this syntax error:
Sphinx picks the wrong path.
I can not work on this now... I am not in front of a Debian box nor do I have
access to one today...
Cheers
Fred
I found in the code a string with a ur''
This is the problematic line.
I do not know if this is a valid string construction.
I also found that you need to remove the sys.path modifications from the
conf.py.
They can cause some trouble during the build.
Cheers.
Fred
I found the culprit: the conf.py file of the documentation prepends ".." to
sys.path before importing the module.
This is why it uses the wrong version of the built module.
Now during the build I have this
D: dh_python3 dh_python3:164: args: []
D: dh_python3 dh_python3:166: supported Python
Now the test part :))
Correlated variables. ... ok
Tests the input of correlated value. ... ok
==
ERROR: Failure: ImportError (No module named tests.support)
> try adding python3-setuptools to Build-Depends
OK, I removed all the black magic from the debian/rules and added setuptools :)
So now I have this error when building the documentation:
PYTHONPATH=`pybuild --print build_dir --interpreter python3`
http_proxy='http://127.0.0.1:9/' sphinx-build
You are right, I did not notice that setuptools was not part of the build
dependencies...
Hello Andreas, it seems to me that the problem is due to the 2to3 conversion.
I looked at the first failure when you re-activate the unit test[1]
In my opinion, the code is modified in place by 2to3.
So the code in the source tree after the configure step is already converted to
python3.
And during
Hello Andreas,
during the tests, does it load the modules from the source files or does it use
the ones under the build directory?
Maybe there is a mismatch between the python2 code and the 2to3-converted code
targeting python3.
Did it help?
Fred
I think that there is a problem with cffi.
pyopencl was built with
python3-cffi-backend i386 1.11.5-1 [80.2 kB]
but the backend used for the test is the current 1.11.5-3.
Here is the Debian changelog:
python-cffi (1.11.5-3) unstable; urgency=medium
[ Ondřej Nový ]
* Use 'python3 -m
Hello,
I rebuilt pyopencl, and the problem vanished.
So what should I do now?
Ask for a binNMU, or try to understand what is going on?
thanks for your time.
Fred
picca@mordor:/tmp$ python3.7-dbg -c "import pyopencl"
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:943:
OK, one more step, and this time I really need your advice :))
$ python3.7-dbg -c "import pyopencl"
/usr/lib/python3/dist-packages/pkg_resources/_vendor/pyparsing.py:943:
DeprecationWarning: Using or importing the ABCs from 'collections' instead of
from 'collections.abc' is deprecated, and in
Ok, I could simplify the problem to a single import
picca@mordor:~$ python3.7-dbg
Python 3.7.0+ (default, Aug 31 2018, 23:21:37)
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import silx.opencl
Hello,
I am trying to understand this test failure [1] with python3.7-dbg.
So I ran this on my unstable box and I got this error from within gdb:
testTrainingSpectrumReading (specfilewrapperTest.testSpecfilewrapper) ... ok
--
Ran 43
Hello,
I am trying to upgrade spyder and activate the unit tests during the build.
But when pybuild runs the tests, it ends with this error message:
dh_auto_test: pybuild --test --test-pytest -i python{version} -p "3.7 3.6"
returned exit code 13
With no other information.
So my question is how
> remove control file and invoke py2dsp - it will regenerate it
> That said, you probably want dch (debchange) rather than new control
> file
Thanks a lot,
It would be nice to have an equivalent of dgit-maint-xxx for maintaining Python
packages.
Maybe in the policy?
This way, people new
Hello,
once debianized, is there a command which allows updating the control file for a
new upstream version, in order to take into account the new Python dependencies?
It would simplify the maintenance of Python packages a lot.
py2dsp update
like the cme command ?
Cheers
Frederic
no problem Ghislain, I am on it :))
Cheers
Fred
Hello, I need to create a new package for
spyder_kernels.
Is there a tool which allows creating the first version of a package, i.e. which
creates the debian/ directory from the setup.py files?
thanks
Frederic
> A bit off-topic, but you should not use Qt 4 in new packages.
> See https://lists.debian.org/debian-devel-announce/2017/08/msg6.html.
I did the migration to qtcreator :))
do not worry
Fred
thanks a lot
Hello, I am preparing the new silx package and I got these error messages from
adequate
This package produces the silx Python module but also installs a bunch of files
for qtdesigner
in the rules file with this command.
# install the qtdesigner files only for the python3 package
What about teaching cme how to fix a package's Build-Depends?
This way a simple
cme fix dpkg
would do the job ?
Or just use the sphinx-generated Makefile if there is one:
Except that when there is autodoc in the documentation, I like to build the
doc with all {interpreters}.
It is a sort of unit test.
Cheers
Fred
Hello guys.
> override_dh_sphinxdoc:
> ifeq (,$(findstring nodocs, $(DEB_BUILD_OPTIONS)))
nodocs or nodoc?
I also do something like this when there are extensions.
override_dh_sphinxdoc:
ifeq (,$(findstring nodoc, $(DEB_BUILD_OPTIONS)))
Hello, it seems that dependencies are not generated for the -dbg packages by
dh_python[23]
Is there a way to ask dh_python to generate these dependencies from the
Build-Depends of a package?
Thanks
Frederic
Hello Ghislain
> Indeed, you need to use the name registered on pypi, which can be
> different from the Debian package name. For OpenGL, the project is
> registered as PyOpenGL, for PyQt5 the name is PyQt5.
yes it works :)
> but for pyqt5 I do not have egg informations. Maybe the solution would
Hello Andrey
> Isn't just adding the package names to Depends easier?
I just want this to be automatically generated and "upstreamable".
> So do you have python-opengl, python-pyqt5 etc in Build-Depends?
Yes, I think that I found the problem for opengl.
The egg-info gives the name of the
Hello,
still working on my silx package...
When it comes to python:Depends, I try to add the right entries in the setup.py.
So I added a bunch of modules there:
-"setuptools"]
+"setuptools",
+# Debian added
+"ipython",
+"qtconsole",
+"enum34",
+
Hello, before reporting a bug against dh-python I would like your opinion.
I am trying to skip the tests for all the debug versions of the interpreter.
So I added this in the rules file:
export PYBUILD_DISABLE_python2-dbg=test
export PYBUILD_DISABLE_python3-dbg=test
and here is my test target
#
Hello, when I try to update one of my packages, I get this error via uscan:
:~/Debian/lmfit-py/lmfit-py$ uscan
uscan warn: In watchfile debian/watch, reading webpage
https://pypi.debian.net/lmfit/ failed: 502 Bad Gateway
Does someone know what is going on?
Cheers
Frederic
OK, if I replace
'{interpreter} setup.py build_man'
by
'env; setup.py build_man'
I get this
HOME=/home/picca
PYTHONPATH=/home/picca/Debian/silx/silx/.pybuild/pythonX.Y_2.7/build
but if I read the code of
def create_pydistutils_cfg(func):
"""distutils doesn't have sane command-line
So I am still investigating.
With some debug output, I can see that the first build part gives:
dh_auto_build -- --after-build '{interpreter} setup.py build_man'
pybuild --build -i python{version} -p 2.7 --after-build "{interpreter}
setup.py build_man"
D: pybuild pybuild:474: version:
Hello Piotr,
I am struggling with the build system.
I will speak about this solution:
> | override_dh_auto_build:
> | dh_auto_build -- --after-build '{interpreter} setup.py build_man'
The instrumented code of BuildMan is this one:
class BuildMan(Command):
"""Command to build man
> The snippet you quoted is not specific to extension modules but to the
> use of the autodoc feature, which requires the modules to be in the
> PYTHONPATH. The `sys.path.insert` hack is just here so that you don't
> have to specify PYTHONPATH yourself when running the upstream Makefile.
I just
> First, that's very speculative. Second, that's upstream's problem.
> The upstream Makefile and conf.py are likely generated by Sphinx itself
> via sphinx-quickstart. Did your upstream tinker with them that much that
> they cannot be trusted?
No, it is just that this does not fit well with
> PYTHONPATH=. sphinx-build -N -b html
> One can also use the sphinx-generated Makefile if available:
> PYTHONPATH=$(CURDIR) $(MAKE) -C html
> Both are simple one-liners and do not rely on pybuild.
Yes, it works, but this is fragile since the organisation of the module can
change in the
> At the end of the day, it is just a matter of providing an appropriate
> PYTHONPATH, regardless of whether pybuild is used or not.
Yes, but to avoid the multiplication of ways to provide this PYTHONPATH:
is it possible to have one recommended way which works for both modules and
extensions?
Once
> Perhaps the LibraryStyleGuide should be updated to reflect on this
> change? I believe we are still advising explicit http_proxy /
> https_proxy exports prior to running sphinx-build.
And running sphinx-build does not work, especially if there are extensions in the
documentation.
sphinx-build
> export PYBUILD_AFTER_BUILD={interpreter} setup.py build_man
Thanks. The only drawback I see with this solution is that I want to run
dh_auto_build 2 times:
- 1 for the arch part (Python modules, extensions and manpages)
- 2 for the indep part (doc build)
Nevertheless, thanks a lot, it is
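Roughly what I mean, as a sketch (untested, and the doc paths are just examples):

export PYBUILD_AFTER_BUILD={interpreter} setup.py build_man

# arch part: Python modules, extensions and manpages (via the hook above)
override_dh_auto_build-arch:
	dh_auto_build

# indep part: build the documentation on top of the built modules
override_dh_auto_build-indep:
	dh_auto_build
	PYTHONPATH=`pybuild --print build_dir --interpreter python3` \
		http_proxy='http://127.0.0.1:9/' \
		sphinx-build -N -bhtml doc/source build/html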
Hello Piotr,
> if you want to test all requested Python interpreters:
I prefer this solution in order to check that the built extensions are working
with all the Python interpreters.
The doc uses autodoc, so it is nice to have this functionality.
| override_dh_auto_build:
| dh_auto_build --
> Simplest I can think of would be to build the extensions inplace
> followed by the call to build_man. Something like:
> override_dh_auto_build:
> dh_auto_build
> python3 setup.py build_ext --inplace
> python3 setup.py build_man
I do not want to build the extensions once more.
Hello,
I am working on the pyfai package.
This package contains one module with extensions (the important point).
The new upstream version 0.14.0 provides a build_man target via the setup.py.
So in order to generate the doc I need to do:
python setup.py build_man
Now if I look at this target, I
Great, +1 for this long awaited migration :))
Here speaks the maintainer of lmfit-py, sardana and spyder.
I will take care of lmfit-py, but I think that the notebook is not a strong
constraint for lmfit-py.
I did the work for sardana, so it is OK in experimental.
I did the work, with others, for spyder.
Hello
> I agree with Ben that you shouldn’t really be using Sphinx for testing
> your source.
OK, but thanks to this I found some missing binary dependencies during the build
process...
So it increased the quality of my packages.
Even if I would prefer a complete unit test suite...
I could put
> how about doing it outside pybuild? Do you really need to build it for
> each interpreter / version?
It is a sort of unit test, run during the build, that autodoc works with all
versions of the interpreter.
The unit test is:
try to import all the modules provided by a package.
Is there something
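(If it helps, the kind of direct check I have in mind would look roughly like this sketch; "foo" is just a placeholder for the real top-level module:)

override_dh_auto_test:
	dh_auto_test
	# poor man's smoke test: the module must at least import with every
	# supported python3 interpreter
	set -e; for py in `py3versions -s`; do \
		PYTHONPATH=`pybuild --print build_dir --interpreter $$py` \
		$$py -c "import foo"; \
	done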
Here is what I have for itango:
Build-Depends: debhelper (>= 9),
dh-python,
python-all-dev,
python-setuptools,
python-tango,
python3-all-dev,
python3-setuptools,
python3-tango
Package: python3-itango
Hello
I am using this snippet when I want to build the Sphinx documentation with all
the available Python versions (this is a sort of unit test).
Usually the Sphinx doc uses autodoc, and then it allows checking that all
modules can be imported.
override_dh_sphinxdoc:
ifeq (,$(findstring nodocs,
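(The snippet got cut off above; the full shape of what I use looks roughly like this, with the package name and doc paths as placeholders:)

override_dh_sphinxdoc:
ifeq (,$(findstring nodoc,$(DEB_BUILD_OPTIONS)))
	set -e; for py in `py3versions -s`; do \
		PYTHONPATH=`pybuild --print build_dir --interpreter $$py` \
		http_proxy='http://127.0.0.1:9/' \
		sphinx-build -N -bhtml doc/source \
			debian/python-foo-doc/usr/share/doc/python-foo-doc/html; \
	done
	dh_sphinxdoc
endif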
> I think in jupyter this is meant to be handled by the --kernel option - ie
> jupyter qtconsole --kernel python2
OK, it works.
> jupyter qtconsole --kernel python3
$ jupyter qtconsole --kernel python3
Traceback (most recent call last):
File "/usr/bin/jupyter-qtconsole", line 9, in
Hello Julien
> Not as far as I know. I'd be happy to review a package if that would
> help.
I just pushed my WIP package here:
git clone
git+ssh://git.debian.org/git/python-modules/packages/python-qtconsole.git
I have a debian/TODO.org file where I put the remaining things to do.
I would be
> that's because alioth interface is ignored. Unfortunately I don't know
> how to disable it and people who didn't read our policy still use it.
Sorry. I plan to read your policy in order to finish my package, but I am
used to the alioth request button :))
Cheers
Frederic
Hello,
I did a request from the alioth website yesterday, in order to maintain
python-qtconsole inside the python-modules repository.
So this is just a ping directly from the mailing list.
thanks
Frederic
Hello,
I saw that a bunch of jupyter projects were uploaded into experimental.
It would be nice to also have the qtconsole part.
I would like to know if there is already an effort to package the current
ipython-qtconsole.
I would be glad to help package this module.
One of my
Hello, just for information, what is the difference with
pypi-install from the python-stdeb package?
Hello guys,
I am working on this bug report [1], and I would like your opinion.
This package depends on the tango library, which was rebuilt with gcc5 and
updated for the libstdc++6 transition.
Now, as you can see in the bug report, pytango FTBFS with a missing symbol.
And indeed the missing
> I have created a small test project (attached) that has a library exporting
> "std::string Tango::ranges_type2const::str" symbol and a test program
> using it.
Thanks a lot.
Now we can see if gcc and g++ generate different symbols.
> Interestingly, GCC uses the symbol
> The actual error is about another symbol:
> _PyTango.so: undefined symbol: _ZN5Tango17ranges_type2constIjE3strE
> This symbol is old ABI, as opposed to
> _ZN5Tango17ranges_type2constIjE3strB5cxx11E
> (which *does* exist in libtango.so.8).
> Can it be a bug in GCC? I don't think it should
> I have created a small test project (attached) that has a library exporting
> "std::string Tango::ranges_type2const::str" symbol and a test program
> using it.
> Interestingly, GCC uses the symbol _ZN5Tango17ranges_type2constIjE3strE
> (*without* the B5cxx11 part) for both library and client.
Hello, thanks for your reply.
you can
1) invoke dh twice with different set of options (if these modules
are in two different directories or use two different setup.py
files/options)
they use the same setup.py for now...
or
2) (ab)use --ext-dest-dir/--ext-pattern to move some files to
Hello,
I am working on a source package pymca, which will provide 2 modules
PyMca5 and fisx
now following the policy I must name the binary package this way
python-pymca5
python3-pymca5
python-fisx
python3-fisx
The previous package provided only one module, so I should use the minimalist
dh
The file is patched, but now I have a d/p/0005- file instead of a modified
0003- patch file. Sigh.
In this case you can use:
git rebase -i master
and edit the commits to merge 0003- and 0005-.
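(Concretely, a sketch of that step:)

$ git rebase -i master
# in the todo list that opens, move the 0005- line just below the 0003- line
# and change its action from "pick" to "fixup" (or "squash"),
# so the two become a single commit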
Cheers
Frederic
Yes. Which wiki page? URL?
this one [1]
Ahum... maintaining a patch is probably an overstatement here.
Here's an example diff (not a valid diff for computers, but enough for
understanding by humans):
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.todo',
Hello
I am trying to find a way to teach my rules file how to generate the documentation
using pybuild.
I found the classic:
override_dh_auto_build-indep:
	dh_auto_build
	PYTHONPATH=. http_proxy='localhost' sphinx-build -N -bhtml doc/source build/html # HTML generator
but it does
You might want to harvest hints given in [0] which hopefully allow you
to keep gbp and use git-svn.
Hope this helps.
I will have a look, thanks for the link
Cheers
Frederic
Hello,
I would like to update the current rope package
from 0.9.2 to 0.9.4 and refresh the packaging using pybuild, because the ropemacs
mode
does not work properly with 0.9.2 but seems to work fine with 0.9.4.
Now, I can not afford to learn svn-buildpackage; days are too short...
I use only
Hello,
I am packaging the next version of spyder, which is a Python IDE.
The next version will support python3,
so I need to add a spyder3 binary package which will contain
/usr/bin/spyder3.
The upstream script only creates /usr/bin/spyder during the build.
So my question is: do we have something in
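(One manual way I can think of, sketched below and completely untested; the debian/spyder3 layout is only an assumption about where pybuild puts the python3 script:)

override_dh_auto_install:
	dh_auto_install
	# upstream only ships /usr/bin/spyder; give the python3 package its own name
	mv debian/spyder3/usr/bin/spyder debian/spyder3/usr/bin/spyder3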
Hello,
I am working on this bug [1]; after some investigation it seems that this is
a problem of incompatibility between
python-qwt5 and python-qt4.
I do not know what is the best way to solve this issue:
rebuild python-qwt5-qt4? But python-qt4 should also be fixed to set the right
Breaks.