Help with packaging a python-using package

2002-02-07 Thread Julian Gilbey
[Please Cc: me with replies; I'm not subscribed.]

I'm not sure what to do with this python-using package I'm interested
in packaging.  Not being a python expert, I'm a bit stumped.
(Although I am planning to learn.)  Here's the lowdown:

It contains a python script.  It contains a compiled library (helper)
program.  And it contains a python module used by the primary script.

How should I package it?  In particular, where do I put the module and
do I have to compile it (and how do I do that)?  I can't quite figure
out what to do based on the Python Policy.

Thanks!

   Julian

-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

 Julian Gilbey, Dept of Maths, Debian GNU/Linux Developer
  Queen Mary, Univ. of London see http://people.debian.org/~jdg/
   http://www.maths.qmul.ac.uk/~jdg/   or http://www.debian.org/
Visit http://www.thehungersite.com/ to help feed the hungry




policy 2.3 para 2

2002-02-09 Thread Julian Gilbey
I have a suggestion, which may already have been thought of.

Need: a python-module (pure Python) providing package should provide
byte-compiled versions for all installed python versions (as long as
there are no version dependency issues)

Parallel: an emacs-module providing package should provide
byte-compiled versions for all installed emacs versions (as long as
there are no version dependency issues)

Why not take the emacsen-common method and code and use this for
python?  It probably won't work for C-extension modules, but it could
make life easier for pure Python ones.

Thoughts?  (Please cc me!)

   Julian

-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

 Julian Gilbey, Dept of Maths, Debian GNU/Linux Developer
  Queen Mary, Univ. of London see http://people.debian.org/~jdg/
   http://www.maths.qmul.ac.uk/~jdg/   or http://www.debian.org/
Visit http://www.thehungersite.com/ to help feed the hungry




Re: policy 2.3 para 2

2002-02-10 Thread Julian Gilbey
On Sun, Feb 10, 2002 at 02:08:56PM +1100, Donovan Baarda wrote:
 On Sat, Feb 09, 2002 at 07:41:31PM +, Julian Gilbey wrote:
  I have a suggestion, which may already have been thought of.
  
  Need: a python-module (pure Python) providing package should provide
  byte-compiled versions for all installed python versions (as long as
  there are no version dependency issues)
 
 Actually, it is slightly more complex than this. For starters, setting aside
 the fact that not all python modules are compatible with all versions of
 python, there is a second need:

Not a problem.

 Need-2: installing a new version of Python should byte-compile all
 pre-installed Python modules for the newly installed Python version.

That's the whole cleverness of the emacsen solution -- see below.

 This makes it a bit tricky because it's hard to know where the
 compile-everything-for-every-version logic should go: in the versioned
 Python packages, in the Python default wrapper, or in the modules
 themselves. The moment you think it's obvious, you dig deeper and find
 it introduces some dependency nasties.

Quite -- but it's really good that the emacsen team have already
solved this problem, so the python team doesn't have to reinvent the
wheel.

 I'm not 100% sure of the details of the emacsen approach, but doesn't it use
 some sort of module-registration database? I can't help but think that it's
 a bit sad that you need to introduce _another_ database of installed stuff
 when you already have the dpkg database. However, perhaps that's the only
 way to get a truly perfect solution.

Yes, emacsen-common reads a database of install/remove command files.
See the /usr/lib/emacsen-common directory.  It allows the packages to
be a little bit clever for a relatively small maintainer cost, as the
extra files are mostly boilerplate.

This allows installing/removing emacs packages to be performed
orthogonally to installing/removing different versions of emacs, and
means that any version-independent emacs packages can work with all
installed emacsen without having to be aware of which particular ones
are installed, and without having to have a separate version for each
emacs package.

   Julian

-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

 Julian Gilbey, Dept of Maths, Debian GNU/Linux Developer
  Queen Mary, Univ. of London see http://people.debian.org/~jdg/
   http://www.maths.qmul.ac.uk/~jdg/   or http://www.debian.org/
Visit http://www.thehungersite.com/ to help feed the hungry




Re: policy 2.3 para 2

2002-02-11 Thread Julian Gilbey
On Mon, Feb 11, 2002 at 11:28:50AM +0100, Bastian Kleineidam wrote:
 On Sat, Feb 09, 2002 at 07:41:31PM +, Julian Gilbey wrote:
  I have a suggestion, which may already have been thought of.
 
 For Python Policy 2.2.3, see
 http://lists.debian.org/debian-python/2002/debian-python-200201/msg00019.html
 I am working on this. I have the previous version online at
 http://people.debian.org/~calvin/purepython/
 which uses debhelper scripts.
 
 Indeed this weekend I have been working on the new version, which
 is similar to emacsen-common (per-package DB entries etc.).
 I still have to test the scripts a little, so expect them to be online
 in the next 2-3 days.

Super!  I'll check it out when I have a moment.

  Why not take the emacsen-common method and code and use this for
  python?  It probably won't work for C-extension modules, but it could
  make life easier for pure Python ones.
 I have looked into the emacsen-common scripts, but I don't like
 programming in Perl; it's a pain for me.
 Anyway, it would be an offense if a Python-related package was written
 in Perl :)

But of course!  However, once it's translated into a sensible
language, the principles should be fine ;-)

   Julian

-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

 Julian Gilbey, Dept of Maths, Debian GNU/Linux Developer
  Queen Mary, Univ. of London see http://people.debian.org/~jdg/
   http://www.maths.qmul.ac.uk/~jdg/   or http://www.debian.org/
Visit http://www.thehungersite.com/ to help feed the hungry




Re: policy 2.3 para 2

2002-02-12 Thread Julian Gilbey
On Mon, Feb 11, 2002 at 10:19:56PM +1100, Donovan Baarda wrote:
   Why not take the emacsen-common method and code and use this for
   python?  It probably won't work for C-extension modules, but it could
   make life easier for pure Python ones.
 
 Just a thought; since we already have a database of packages in the form of
 the dpkg database, is it possible to do something simple using it instead of
 introducing some other database? As an example, can a python package call
 dpkg-reconfigure for all installed packages that depend on python in
 its postinst script?

You have two cases to consider:

(1) Installing/removing pythonX.Y package

(2) Installing/removing a python-depending package

Your suggestion will perhaps work for the former but not for the
latter: every python package which wishes to use this scheme will have
to have postinst code anyway.

It may be that, as python is simpler, we can simply have a script in a
python-common package which does something like the following (pardon
me if I get the details a little wrong) -- this could be rewritten in
python, of course, although there's then a small question of which
python version would be used:

#! /bin/sh

# Check for any arguments limiting the versions of python to be used,
# or similar options.

while [ $# -gt 0 ]; do
    for pyth in /usr/bin/python[12].[0-9]; do
        cd /usr/lib/`basename $pyth`
        ln -s ../python-common/$1 .
        $pyth -c "import py_compile; py_compile.compile('$1'); py_compile.compile('$1', '${1}o')"
    done
    shift
done

and a similar one for a package removal.  Then on installation, a
python-module providing package could say in its postinst:

debian-python-compile foo.py bar.py wibble.py wobble.py

and install these modules into /usr/lib/python-common.

Or alternatively, just install them all into /usr/lib/python-common
and run python-update-install, which is essentially the same as the
above but doesn't take filename arguments and does the following
instead:

for pyth in /usr/bin/python[12].[0-9]; do
    cd /usr/lib/`basename $pyth`
    for module in /usr/lib/python-common/*.py; do
        module=`basename $module`
        if [ -L $module ]; then continue; fi
        ln -s ../python-common/$module .
        $pyth -c "import py_compile; py_compile.compile('$module'); py_compile.compile('$module', '${module}o')"
    done
done

However, that doesn't handle the case of removing packages.  This
could be done using dpkg as you suggest: in the prerm, call
python-update-remove $pkg, and this does the following:

for pyth in /usr/bin/python[12].[0-9]; do
    cd /usr/lib/`basename $pyth`
    for module in `dpkg -L $pkg | grep '^/usr/lib/python-common/.*\.py$'`; do
        module=`basename $module`
        if [ -L $module ]; then
            rm -f $module ${module}c ${module}o
        fi
    done
done

Now that's seriously easier than the emacsen-common solution, and is
possibly adequate for this situation.

   Julian

-- 
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

 Julian Gilbey, Dept of Maths, Debian GNU/Linux Developer
  Queen Mary, Univ. of London see http://people.debian.org/~jdg/
   http://www.maths.qmul.ac.uk/~jdg/   or http://www.debian.org/
Visit http://www.thehungersite.com/ to help feed the hungry




Joining DPMT

2019-11-17 Thread Julian Gilbey
Hello,

I currently maintain send2trash, and it was suggested to me by Sandro
Tosi that I join the DPMT to team-maintain it.  That sounds like an
excellent idea to me!

My salsa login is jdg

I have read and accept the DPMT policy.

Best wishes,

   Julian



Re: Don't add me to your debian/control Uploaders when it is untrue

2021-04-06 Thread Julian Gilbey
On Fri, Apr 02, 2021 at 10:55:10AM -0700, Otto Kekäläinen wrote:
> Hello Julian!
> I noticed that you've added me to several packages as the uploader:
> debian/control:
>     Uploaders: Otto Kekäläinen ,
> Please do not do it. I have not reviewed/uploaded (to my knowledge) any of
> those packages and it is wrong to list me there as an uploader. Do not do
> this for new packages and remove me from the old packages next time you
> upload a new version (not urgent).

Dear Otto,

I'm sorry - I copied the control file for one of the packages I
started working on to get Spyder 4.x into bullseye, and it had you as
an uploader.  Apologies for including you without your agreement.  I'm
totally willing to remove your name from the uploaders field after
bullseye has released, but I'm now not sure which packages should and
shouldn't have your name listed.  Here's a list of the packages I've
uploaded which I think have your name in them; I will remove you from
all of them unless you say otherwise:

abydos
distance
paq
pyls-black
pyls-spyder
pylzss
py-stringmatching
python-jsonrpc-server
python-language-server
pyxdameraulevenshtein
syllabipy
textdistance
three-merge

Best wishes,

   Julian



Re: Debian python-language-server package forks

2021-09-19 Thread Julian Gilbey
On Fri, Sep 17, 2021 at 11:57:44PM +0200, Jochen Sprickerhof wrote:
> Hi,
> 
> the Python language server packages were recently forked upstream (because
> the developers lost access to the original repos) and the development
> shifted there. I started working on new Debian packages for those (cf.
> #994007):
> [...]
> 
> I'm also planning to package these pylsp plugins: pylsp-mypy, pyls-flake8,
> pyls-isort and will fill ITPs for them soon.
> 
> Cheers Jochen

Hi Jochen,

That's fantastic, thanks so much!  I won't have any time to think
about this for the next two weeks, but once they're in unstable, I'll
look at upgrading Spyder to version 5.x.

Best wishes,

   Julian



Installing data files with pybuild

2021-12-01 Thread Julian Gilbey
Hello!

pybuild is magic.  It knows where the build directory is, despite it
seemingly calling setup.py with no command line arguments specifying
it!

But anyway, my actual question is this.  How do I ensure that the data
files are also copied into the package?  My package has a source tree
that looks roughly like this:

setup.py
setup.cfg
MANIFEST.in
...
debian/
mypkg/
  __init__.py
  module1.py
  imgs/
img1.png
img2.png
  data/
data.csv
data2.txt
  module2/
__init__.py
foo.py
module2_data.txt

Only the .py files are currently included in the build; what is the
best way to include all of the data files after the build step and
before the test step, and then to ensure they are included in the
final package?

Thanks!

   Julian



Re: Installing data files with pybuild

2021-12-01 Thread Julian Gilbey
On Wed, Dec 01, 2021 at 01:32:38PM -0500, Louis-Philippe Véronneau wrote:
> On 2021-12-01 12 h 28, Andrey Rahmatullin wrote:
> > 
> >> Only the .py files are currently included in the build; what is the
> >> best way to include all of the data files after the build step and
> >> before the test step, and then to ensure they are included in the
> >> final package?
> > Apart from fixing setup.py? execute_after_dh_auto_install I guess.
> 
> Or you can use debian/foobar.install to install the missing files in the
> right location, and keep your d/rules file cleaner :)
> 
> ex. (man dh_install):
> 
> https://sources.debian.org/src/sublime-music/0.11.16-1/debian/sublime-music.install/

Thanks Andrey and Louis-Philippe!

Yes, that might work, though the build-time tests might still fail, as
I think they expect these files to be present at test time.  Since
pybuild runs the build-time tests against the built library before it
is installed, there might be a problem here.

I don't understand why the files are not correctly installed by
setup.py; it has set include_package_data=True, which includes the
files (and file patterns) listed in MANIFEST.in, but it seems that
this is only honoured by the sdist and bdist targets of setup.py, not
by the install target.
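
If patching setup.py were acceptable, one way to make the install
target pick the data files up as well would be to declare them
explicitly via package_data.  A minimal sketch based on the tree above
(the patterns are my own guesses and mypkg is just the placeholder name
from the example):

from setuptools import setup, find_packages

setup(
    name="mypkg",
    packages=find_packages(),
    package_data={
        "mypkg": ["imgs/*.png", "data/*"],
        "mypkg.module2": ["module2_data.txt"],
    },
)

That works independently of MANIFEST.in, so it would also cover the
install target.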

There is another possibility, which is to use the --after-build
pybuild option or the PYBUILD_AFTER_BUILD environment variable.  There
is a discussion of this in the scikit-image package here:
https://github.com/scikit-image/scikit-image/issues/4366 which links
to the actual code in debian/rules:
https://salsa.debian.org/science-team/scikit-learn/blob/37a7b4eda8b15f514c6d0f6f801c2d67e35e36c7/debian/rules#L77-84
But I was wondering whether there is a "nicer" way to do it.

Thanks!

   Julian



Bug#1005701: ITP: python-untangle -- Convert XML to a Python object

2022-02-13 Thread Julian Gilbey
Package: wnpp
Severity: wishlist
Owner: Julian Gilbey 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-python@lists.debian.org

* Package name: python-untangle
  Version : 1.1.1+git20200807.fb916a9
  Upstream Author : Christian Stefanescu 
* URL : https://github.com/stchris/untangle
* License : MIT
  Programming Lang: Python
  Description : Convert XML to a Python object

 Untangle has a very simple API for converting XML to a Python object:
 you just call the "parse" function on either a string, a filename or a URL.
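
For illustration, a minimal example along the lines of the upstream README:

import untangle

doc = untangle.parse('<root><child name="child1"/></root>')
print(doc.root.child['name'])   # prints 'child1'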


This package is used in the tests for pydevd, which I am currently
working on packaging
(https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933070).

It will be maintained within the Debian Python Team.

Best wishes,

   Julian



Re: Advice wanted: handling weird vendoring situation

2022-02-16 Thread Julian Gilbey
On Sun, Feb 13, 2022 at 12:07:44PM +0100, Gregor Riepl wrote:
> > So the solution I'm currently in the process of trying is to copy the
> > version from the oldest supported Python version at Debian package
> > build time.  We'll see how this goes
> > 
> >> Perhaps they have a maintenance script for updating the vendored
> >> dependencies? You could use that to find out how to reverse the changes,
> >> or start from a clean slate?
> > 
> > Unlikely; some of their vendored dependencies date back to Python 2.5!
> 
> In that case, I think this is the issue that must be solved first:
> Ensuring that their code is compatible with the *latest* published
> version, and vendoring the system version at build time.

Hi Gregor,

Thanks for your thoughts.

Yes, and that is what I'm going with.  Unfortunately, upstream are
trying to keep pydevd compatible with almost every version of Python
that exists (Python 2.7, Python 3.x, PyPy, IronPython and Jython), so
they cannot necessarily use the latest versions of the library.  But
I'm only going to aim to support the supported versions of CPython in
Debian, so I can use the latest versions of the library.  (Actually, I
am using the version in the lowest Debian-supported version of Python,
in case a newer version of the library uses newly introduced syntax.)

> Not a good solution, since it will still leave the system vulnerable
> when one of the dependencies gets a security update, but better than
> shipping a static version that might have numerous issues and will no
> longer receive any patches.

Indeed.

> The alternative would be that they take full responsibility for their
> vendored code, but then it will be much harder to detect when they're
> affected by a vulnerability.

Quite.

Best wishes,

   Julian



Re: Advice wanted: handling weird vendoring situation

2022-02-12 Thread Julian Gilbey
On Wed, Feb 09, 2022 at 07:39:40PM +0100, Gregor Riepl wrote:
> > I realise now that this "nice" solution won't work, as the standard
> > library code says:
> > 
> > import socketserver
> > 
> > so modifying sys.path will just change the value of
> > sys.modules["socketserver"].  However, the vendored code instead loads
> > this module to sys.modules["_pydev_imps._pydev_SocketServer"] or
> > something like that, deliberately avoiding interfering with
> > sys.modules["socketserver"].
> 
> It seems to me that the "correct" solution would be to motivate upstream
> not to vendor anything in their source tree. If they really need
> vendoring to avoid compatibility issues with various environments, they
> should do so only when building releases. It still wouldn't solve the
> problem of incompatible system modules, but at least it would make it
> clearer which versions they require and why.

Hi Gregor,

Thanks!

And indeed, that would be a good idea, but I doubt it's going to
happen :( They don't need "vendoring" as far as I can tell, but they
just need a pristine copy of the system library that can be loaded
independently of the system library, so that if gevent patches the
system library at runtime, this package still has access to an
unpatched copy.

So the solution I'm currently in the process of trying is to copy the
version from the oldest supported Python version at Debian package
build time.  We'll see how this goes

> Perhaps they have a maintenance script for updating the vendored
> dependencies? You could use that to find out how to reverse the changes,
> or start from a clean slate?

Unlikely; some of their vendored dependencies date back to Python 2.5!

Best wishes,

   Julian



Re: Please make a separate package for mistune 2.x

2022-02-04 Thread Julian Gilbey
On Thu, Feb 03, 2022 at 11:34:26PM +0100, Pierre-Elliott Bécue wrote:
> Hi Michael,
> 
> > Since Mistune 2.0.0 regresses its support for standard markdown, I ask
> > that a separate package be made for mistune 2.x to give more time for 
> > mistune 0.8.x users to migrate to mistune 2.x or another library entirely.
> >
> > Cheers,
> 
> I'm not formally against it, but it's not really standard in my
> opinion. It'd lead to maintenance of two packages, probably on some long
> term (as it'd relieve the pressure to migrate for maintainers of
> reverse-deps of mistune).
> 
> Besides, having a source package named "mistune2" while upstream's
> package is named "mistune" also adds another layer of complexity.
> 
> I'd like to have some python team members' opinion on this, and I am not
> sure I am eager to do it as of now.

I agree that this sounds like a terrible idea.  What would the Python
module in a separate python3-mistune2 package be called?  If
it were called mistune, then python3-mistune2 would have to conflict
with python3-mistune (or "python3-mistune0.8.4" or similar), and there
would be little benefit.  If the module itself were renamed to
mistune2, then the Debian mistune package would be incompatible with
any other software requiring mistune.

Basically, the mistune upstream author has completely messed up on
this by making what is essentially a completely different package with
superficially similar functionality but the same name.

What the Debian nbconvert maintainers have done is to vendor mistune
0.8.4: they now include it as _mistune.py within the Debian package,
and have nbconvert do "import nbconvert.filters._mistune as mistune"
(see /usr/lib/python3/dist-packages/nbconvert/filters/markdown_mistune.py).
That seems like an eminently sensible solution to this problem.  Maybe
not ideal, but it will work until the upstream maintainers find a way
to work with mistune 2.0.x.

Best wishes,

   Julian



Re: Please make a separate package for mistune 2.x

2022-02-04 Thread Julian Gilbey
On Fri, Feb 04, 2022 at 09:27:59PM +0530, Nilesh Patra wrote:
> On 2/4/22 9:18 PM, Julian Gilbey wrote:
> > Basically, the mistune upstream author has completely messed up on
> > this by making what is essentially a completely different package with
> > superficially similar functionality but the same name.
> 
> True.
> > [...]
> > _mistune.py within the Debian package,
> > and have nbconvert do "import nbconvert.filters._mistune as mistune"
> > (see /usr/lib/python3/dist-packages/nbconvert/filters/markdown_mistune.py).
> > That seems like an eminently sensible solution to this problem.
> 
> But that'd lead to a number of embedded copies of mistune in a huge number
> of packages, since the majority of the rev-deps (when I last checked)
> haven't adapted to this new version, and it becomes an overhead to fix each
> one later when they do.
> Even worse, if we discover a security problem sometime later, then all such
> packages would be affected, and that honestly does not look like a good idea
> to me.

This is true, though there are only 7 reverse dependencies currently
in testing.

> I somehow do not understand the urgency of uploading this newer version, as 
> the maintainer said:
> 
> | I intend to upload src:mistune 2.0.0 to unstable between March the
> | 15th and April the 15th (depending on the progress of its
> | reverse-dependencies).
> 
> We could simply wait a little more for the dust to settle, IMHO.

That would be a reasonable approach, but how long will it take for the
dust to settle?  With this major change, and no guidance from upstream
on how to migrate, and a number of upstream authors happy to
rely on setup.py having "mistune <1.0.0" in the install_requires
field, it might be several months or longer before things are fixed
upstream.  And what do we do when some packages have converted and
some haven't?

Best wishes,

   Julian



Bug#1004746: lintian: provide a check for Python package version numbers validity

2022-02-01 Thread Julian Gilbey
Package: lintian
Version: 2.114.0
Severity: wishlist
X-Debbugs-Cc: Debian Python Team 

I just hit two packages which gave me the following warning when
pkg_resources tried to load them:

/usr/lib/python3/dist-packages/pkg_resources/__init__.py:116: 
PkgResourcesDeprecationWarning: 1.12.1-git20200711.33e2d80-dfsg1-0.6 is an 
invalid version and will not be supported in a future release
  warnings.warn(

(and a different version number in the other case).  The upstream
Python developers have a clear idea of what is accepted as a version
number, and it appears in
/usr/lib/python3/dist-packages/pkg_resources/_vendor/packaging/version.py
(in the python3-pkg-resources package) in the definition of
VERSION_PATTERN.

The version number that is being examined is that stored in
/usr/lib/python3/dist-packages/*.egg-info or
/usr/lib/python3/dist-packages/*.egg-info/PKG-INFO on the line
beginning "Version: ".

This appears to be a fairly rare bug: only two packages on my system
have this issue (and I've just reported bugs against them).
Nonetheless, if it is easy, it would be nice to have a lintian test
for it.
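
As a rough illustration of the check (in Python rather than lintian's
Perl, using the packaging module whose VERSION_PATTERN pkg_resources
vendors; the helper name is mine):

from packaging.version import Version, InvalidVersion

def is_valid_python_version(version_string):
    # True if version_string is a valid PEP 440 version
    try:
        Version(version_string)
        return True
    except InvalidVersion:
        return False

print(is_valid_python_version("1.12.1"))                                # True
print(is_valid_python_version("1.12.1-git20200711.33e2d80-dfsg1-0.6"))  # False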

Best wishes,

   Julian



Advice wanted: handling weird vendoring situation

2022-02-07 Thread Julian Gilbey
Hi,

I'm working towards packaging pydevd (which is a recursive dependency
of Spyder via IPython), and it's a somewhat challenging package!  I
have hit an issue and would appreciate some thoughts on how best to
handle it.

Background:

pydevd is a debugging package which can attach to running scripts.  It
is used by PyDev, PyCharm, VSCode and Spyder, and with Spyder, it is
imported through debugpy, which in turn is imported into IPython.

In order to ensure that the libraries it is using have not been
modified by gevent, it uses vendored copies of various standard
library modules rather than just saying "import inspect" etc.

I thought I could address this issue by replacing the vendored copies
of the library modules by symlinks to /usr/lib/python3.X/, but now
I've hit another snag: some of these modules import other modules.
For example:

pydev_imps/_pydev_SimpleXMLRPCServer.py
is a very old version of /usr/lib/python3.X/xmlrpc/server.py.  It
contains within it the following lines:

from _pydev_imps import _pydev_xmlrpclib as xmlrpclib
from _pydev_imps._pydev_xmlrpclib import Fault
from _pydev_imps import _pydev_SocketServer as SocketServer
from _pydev_imps import _pydev_BaseHTTPServer as BaseHTTPServer

These libraries are:
_pydev_imps._pydev_xmlrpclib -> xmlrpc.client
_pydev_imps._pydev_SocketServer -> socketserver
_pydev_imps._pydev_BaseHTTPServer -> http.server

So what should I do?

One solution is just to symlink from _pydev_SimpleXMLRPCServer.py to
/usr/lib/python3.X/xmlrpc/server.py and not worry about the other
modules.  But that might break things in non-obvious ways, so I don't
want to do that.

Another possible solution is to update all of the vendored copies in
_pydev_imps using the relevant /usr/lib/python3.X/* modules and making
the same modifications to the imports to load local copies.  But then
we will have duplicate copies of standard library modules, which I
also don't want to do.

Perhaps another possibility is to have symlinks in the _pydev_imps
directory to the standard library versions and then temporarily modify
sys.path to look in _pydev_imps before looking in standard locations.
I don't know whether this will work, though.

(There is another snag, too, which is that the path will depend on the
Python version.  So I will probably have
_pydev_imps/python_39/socketserver.py -> /usr/lib/python3.9/socketserver.py
and
_pydev_imps/python_310/socketserver.py -> /usr/lib/python3.10/socketserver.py
But that's a simpler issue.)

Any thoughts on these ideas would be much appreciated!

Best wishes,

   Julian



Re: Please make a separate package for mistune 2.x

2022-02-05 Thread Julian Gilbey
On Fri, Feb 04, 2022 at 09:27:59PM +0530, Nilesh Patra wrote:
> On 2/4/22 9:18 PM, Julian Gilbey wrote:
> > [...]
> > _mistune.py within the Debian package,
> > and have nbconvert do "import nbconvert.filters._mistune as mistune"
> > (see /usr/lib/python3/dist-packages/nbconvert/filters/markdown_mistune.py).
> > That seems like an eminently sensible solution to this problem.
> 
> But that'd lead to a number of embedded copies of mistune in a huge number
> of packages, since the majority of the rev-deps (when I last checked)
> haven't adapted to this new version, and it becomes an overhead to fix each
> one later when they do.
> Even worse, if we discover a security problem sometime later, then all such
> packages would be affected, and that honestly does not look like a good idea
> to me.

I have just had another idea, which might solve all of the problems:
create a new Debian package called mistune0 (or mistune1), which
contains the legacy version of mistune, but with the Python module
called "mistune0" instead of "mistune".  Then it will be
co-installable with mistune 2.x, and the packages which still depend
on mistune 0.8.4 could be patched to say "import mistune0 as mistune"
until they are updated upstream.  This will also avoid having multiple
copies of the legacy code in the archive, which addresses the security
issue, and allow those packages which have migrated to mistune 2.x to
still say "import mistune".
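
The interim patch to a not-yet-migrated package would then be a
one-line change, something like this (mistune0 being the proposed, so
far hypothetical, module name):

import mistune0 as mistune   # legacy 0.8.x API, co-installable with 2.x

html = mistune.markdown("**hello**")   # unchanged from mistune 0.8.4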

Best wishes,

   Julian



Re: Advice wanted: handling weird vendoring situation

2022-02-07 Thread Julian Gilbey
On Mon, Feb 07, 2022 at 09:27:28PM +, Julian Gilbey wrote:
> [...]
> 
> I thought I could address this issue by replacing the vendored copies
> of the library modules by symlinks to /usr/lib/python3.X/, but now
> I've hit another snag: some of these modules import other modules.
> For example:
> 
> pydev_imps/_pydev_SimpleXMLRPCServer.py
> is a very old version of /usr/lib/python3.X/xmlrpc/server.py.  It
> contains within it the following lines:
> 
> from _pydev_imps import _pydev_xmlrpclib as xmlrpclib
> from _pydev_imps._pydev_xmlrpclib import Fault
> from _pydev_imps import _pydev_SocketServer as SocketServer
> from _pydev_imps import _pydev_BaseHTTPServer as BaseHTTPServer
> [...]
> Perhaps another possibility is to have symlinks in the _pydev_imps
> directory to the standard library versions and then temporarily modify
> sys.path to look in _pydev_imps before looking in standard locations.
> I don't know whether this will work, though.

I realise now that this "nice" solution won't work, as the standard
library code says:

import socketserver

so modifying sys.path will just change the value of
sys.modules["socketserver"].  However, the vendored code instead loads
this module to sys.modules["_pydev_imps._pydev_SocketServer"] or
something like that, deliberately avoiding interfering with
sys.modules["socketserver"].
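
For reference, the effect the vendored code is after can be sketched
with importlib: load a fresh copy of the module and register it only
under a private name, leaving sys.modules["socketserver"] untouched.
The helper name below is mine:

import importlib.util
import sys

def load_private_copy(stdlib_name, private_name):
    # Load a fresh instance of a stdlib module, registered only under
    # private_name, so sys.modules[stdlib_name] is left alone.
    spec = importlib.util.find_spec(stdlib_name)
    module = importlib.util.module_from_spec(spec)
    sys.modules[private_name] = module
    spec.loader.exec_module(module)
    return module

SocketServer = load_private_copy("socketserver",
                                 "_pydev_imps._pydev_SocketServer")

So a symlink-based approach would have to do something similar rather
than rely on sys.path.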

Ho hum.

   Julian



Re: Bug#1005043: lintian: check that Python version numbers are not 0.0.0

2022-02-07 Thread Julian Gilbey
On Tue, Feb 08, 2022 at 12:26:01AM +, Stefano Rivera wrote:
> Hi Julian (2022.02.07_06:26:38_+)
> > I'm a little confused by this.  Have a look at
> > https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1005039 against
> > python3-iniconfig.  It has a very straightforward debian/rules, using
> > pybuild, and its setup.py script has "use_scm_version=True", but it
> > still produces a python package with version number 0.0.0.
> > 
> > I have tried this in an environment where I have
> > python3-setuptools-scm installed, by the way (even though the package
> > does not Build-Depends on it).  I'm using dh-python version 5.20220119
> 
> That's the issue, it *needs* to Build-Depend on that (pybuild only
> exports the PRETEND environment variable when there is a
> Build-Dependency).
> 
> Committed a fix to git.

Ah, thanks Stefano!

Best wishes,

   Julian



Re: Bug#1005043: lintian: check that Python version numbers are not 0.0.0

2022-02-06 Thread Julian Gilbey
On Sat, Feb 05, 2022 at 04:42:57PM -0500, Sandro Tosi wrote:
> > The test for this bug (and it should probably be recorded as an error,
> > not just a warning, as no Python package should have a version number
> > of 0.0.0)
> 
> what exactly is the problem that would make it an 'error'?

When a package uses pkg_resources to determine the version number of
some package, it gets back the wrong information.  This is certainly
a packaging error: the upstream package, as installed by pip,
announces "this is version 1.2.3" but the Debian package announces
"this is version 0.0.0".

In the couple of cases I've looked at so far, it is due to the
upstream version using use_scm_version in setup.py.  This works fine
for a version that is in a Git repository, but it doesn't work for
Debian packages, as the Git version lookup fails.  So this needs to be
patched.

Perhaps a better way would be for dh_python3 to handle this by
"teaching" use_scm_version to look at debian/changelog, as this would
save 30+ packages having to continually update a setup.py patch.
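
Something along these lines, as a sketch only (the changelog parsing
here is deliberately simplistic):

import os
import re

with open("debian/changelog") as f:
    first_line = f.readline()
# e.g. "iniconfig (1.1.1-1) unstable; urgency=medium"
version = re.match(r"\S+ \(([^)]+)\)", first_line).group(1)
version = re.sub(r"-[^-]+$", "", version)      # strip the Debian revision
version = version.split(":", 1)[-1]            # strip any epoch
os.environ["SETUPTOOLS_SCM_PRETEND_VERSION"] = version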

What do you think?

Best wishes,

   Julian



Bug#1005043: lintian: check that Python version numbers are not 0.0.0

2022-02-05 Thread Julian Gilbey
Package: lintian
Version: 2.111.0
Severity: wishlist
X-Debbugs-Cc: debian-python@lists.debian.org

I just ran into several Python packages that install modules with
version number 0.0.0 because of some issue with their setup.py
scripts.  I just did the following on my testing system:

$ lz4cat /var/lib/apt/lists/deb.debian.org_debian_dists_testing_main_Contents-all.lz4 \
    | grep 'usr/lib/python3/dist-packages/.*-0\.0\.0\..*-info/PKG-INFO' | wc -l
24

$ lz4cat /var/lib/apt/lists/deb.debian.org_debian_dists_testing_main_Contents-all.lz4 \
    | grep 'usr/lib/python3/dist-packages/.*-0\.0\.0\..*-info ' | wc -l
6

So there are at least about 30 packages with this problem.

The test for this bug (and it should probably be recorded as an error,
not just a warning, as no Python package should have a version number
of 0.0.0) is simple: if the binary package contains a file or
directory with the name as in the above regex, then the package has
this error.

Best wishes,

   Julian



Re: Bug#1005043: lintian: check that Python version numbers are not 0.0.0

2022-02-06 Thread Julian Gilbey
On Sun, Feb 06, 2022 at 04:46:53PM +, Stefano Rivera wrote:
> Hi Julian (2022.02.06_12:19:54_+)
> > In the couple of cases I've looked at so far, it is due to the
> > upstream version using use_scm_version in setup.py.  This works fine
> > for a version that is in a Git repository, but it doesn't work for
> > Debian packages, as the Git version lookup fails.  So this needs to be
> > patched.
> 
> Or export SETUPTOOLS_SCM_PRETEND_VERSION.
> https://github.com/pypa/setuptools_scm#environment-variables
> 
> pybuild does this for you.

Hi Stefano,

I'm a little confused by this.  Have a look at
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1005039 against
python3-iniconfig.  It has a very straightforward debian/rules, using
pybuild, and its setup.py script has "use_scm_version=True", but it
still produces a python package with version number 0.0.0.

I have tried this in an environment where I have
python3-setuptools-scm installed, by the way (even though the package
does not Build-Depends on it).  I'm using dh-python version 5.20220119

Best wishes,

   Julian



Bug#1005267: ITP: python-bytecode -- Python module to generate, modify and optimize Python bytecode

2022-02-09 Thread Julian Gilbey
Package: wnpp
Severity: wishlist
Owner: Julian Gilbey 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-python@lists.debian.org

* Package name: python-bytecode
  Version : 0.13.0
  Upstream Author : Victor Stinner  and
Matthieu C. Dartiailh 
* URL : https://github.com/MatthieuDartiailh/bytecode
* License : MIT
  Programming Lang: Python
  Description : Python module to generate, modify and optimize Python 
bytecode

The bytecode module can be used to write Python bytecode directly and
then convert it into executable Python statements.  It also provides a
pure Python implementation of the Peephole Optimizer introduced in
CPython 3.6.
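
A minimal usage sketch, adapted from the upstream README (so the exact
opcodes are illustrative and tied to current CPython versions):

from bytecode import Bytecode, Instr

bc = Bytecode([Instr("LOAD_NAME", "print"),
               Instr("LOAD_CONST", "Hello, World!"),
               Instr("CALL_FUNCTION", 1),
               Instr("POP_TOP"),
               Instr("LOAD_CONST", None),
               Instr("RETURN_VALUE")])
exec(bc.to_code())   # prints "Hello, World!"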


This Python module was vendored by the pydevd developers, and having a
standalone version of the module will avoid having the embedded module
in pydevd.  (And pydevd turns out to be a recursive dependency of
Spyder.)  The Debian version of this package will therefore require
the pydevd patch; this enhances the bytecode functionality if it is
used (via an optional argument) but has no effect otherwise.

It will be maintained within the Debian Python Team.



Re: Reaching team consensus on usage of py3versions -r and X-Python3-Version in Lintian

2022-01-19 Thread Julian Gilbey
On Tue, Jan 18, 2022 at 03:37:13PM +0100, Thomas Goirand wrote:
> On 1/17/22 18:47, Louis-Philippe Véronneau wrote:
> > Hey folks,
> > 
> > I'm following up on bug #1001677 [1] on the DPT's list to try to reach
> > consensus, as I think the Lintian tags that were created to fix this bug
> > are not recommending the proper thing.
> > 
> > As a TL;DR for those of you who don't want to read the whole BTS thread,
> > jdg saw that a bunch of packages were using `py3versions -r` in
> > autopkgtests, and this fails when there's no X-Python3-Version variable
> > in d/control.
> > 
> > The fix that Lintian now proposes for packages that use `py3versions -r`
> > in autopkgtests is to set X-Python3-Version.
> > 
> > I think the proper fix would be to ask people to move away from
> > `py3versions -r` if there is no X-Python3-Version, and use`py3versions
> > -s` instead.
> > [...]

Dear all,

The lintian maintainers have indeed taken this suggestion on board and
have made most of this change (modulo some minor wording differences);
the current wording is attached below.  I wonder whether, in the case
where py3versions --supported is used while X-Python3-Version is
present, the message should recommend removing the X-Python3-Version
field as the preferred option, as it does in the other case?


On some of the more general points, I scanned through every source
package in unstable and found the following:

* 6 packages use py3versions -d (--default) in their test suite; this
  is clearly an error in some packages (brial, ogre, sfepy, sinntp),
  and possibly an error in two others (numpy, scipy) - I haven't
  looked in detail at the last two

* Quite a few packages use py3versions -i (--installed) in their test
  suite; this is almost certainly wrong (though it probably does the
  right thing in an autopkgtest if it depends on python3-all)

As far as X-Python3-Version goes:

* 5 packages have X-Python3-Version: ${python3:Versions}, which is
  surely wrong?

* 2 packages have commented-out X-Python3-Version lines; these could
  probably just be removed

* about 166 packages have X-Python3-Version: >= 3.x, where 0 <= x <= 7.
  All of these packages would work in oldstable (buster) or later
  without this line

* 8 packages have >= 3.8 and 3 packages have >= 3.9; these would work
  in stable (bullseye) or later without this line

* 6 packages have just 3.9 (calibre, pytorch-audio, pytorch-ignite,
  pytorch-text, pytorch-vision, skorch); these will no longer work in
  testing (bookworm) once Python has fully migrated to 3.10.

* 6 packages have "all" or "current"; I have filed bug reports on
  these.

So of the entire archive, only 6 packages have a correct and current
use of X-Python3-Version.


Here is the current Lintian wording:

For use of py3versions --requested/-r without a corresponding
X-Python3-Version:
https://salsa.debian.org/lintian/lintian/-/blob/master/tags/d/declare-python-versions-for-test.tag

Tag: declare-python-versions-for-test
Severity: warning
Check: testsuite
Renamed-from:
 declare-requested-python-versions-for-test
Explanation: The specified test attempts to query the Python versions
 requested by your sources with the command
 py3versions --requested but your sources do not declare
 any versions with the field X-Python3-Version.
 .
 Please choose between two suggested remedies:
 .
 In most circumstances, it is probably best to replace the argument
 --requested with --supported. That will
 exercise the test with all available Python versions.
 .
 Should your installable require only specific Python versions, please add
 the field X-Python3-Version with the appropriate information
 to the source stanza in the debian/control file.
 .
 No redirection of the output, as in 2>/dev/null, is
 needed in either case.
See-Also:
 py3versions(1),
 Bug#1001677


For the use of py3versions --supported/-s with the presence of
X-Python3-Version (a rare occurrence):
https://salsa.debian.org/lintian/lintian/-/blob/master/tags/q/query-declared-python-versions-in-test.tag

Tag:  query-declared-python-versions-in-test
Severity: warning
Check: testsuite
Renamed-From:
 query-requested-python-versions-in-test
Explanation: The specified test queries all supported Python versions
 with the command py3versions --supported but your sources request
 a specific set of versions via the field X-Python3-Version.
 .
 Please query only the requested versions with the command
 py3versions --requested.
See-Also:
 py3versions(1),
 Bug#1001677


Best wishes,

   Julian



PyQt5 question: why are QOpenGLTimeMonitor/QOpenGLTimerQuery not defined on armhf?

2022-04-07 Thread Julian Gilbey
Well, the new version of qtpy failed the CI checks on armhf, and I
don't understand why.  The cause is that the following imports raise
errors:

(sid_armhf-dchroot)jdg@abel:~$ python3
Python 3.10.4 (main, Apr  2 2022, 09:04:19) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from PyQt5.QtGui import QOpenGLTimeMonitor
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'QOpenGLTimeMonitor' from 'PyQt5.QtGui' 
(/usr/lib/python3/dist-packages/PyQt5/QtGui.abi3.so)
>>> from PyQt5.QtGui import QOpenGLTimerQuery
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'QOpenGLTimerQuery' from 'PyQt5.QtGui' 
(/usr/lib/python3/dist-packages/PyQt5/QtGui.abi3.so)

Why are these two classes not defined on armhf?  I had a quick look at
the source code for PyQt5, and I see the following:

sip/QtGui/qopengltimerquery.sip, lines 23 onwards:
%If (Qt_5_1_0 -)
%If (PyQt_Desktop_OpenGL)

class QOpenGLTimerQuery : QObject
{
%TypeHeaderCode
#include <qopengltimerquery.h>
%End
[...]

So I presume that if PyQt_Desktop_OpenGL is defined, then this is
included.  This would be set in config-tests/cfgtest_QtGui.cpp, lines
28 onwards:

#if defined(QT_NO_OPENGL)
out << "PyQt_OpenGL\n";
out << "PyQt_Desktop_OpenGL\n";
#elif defined(QT_OPENGL_ES) || defined(QT_OPENGL_ES_2) || 
defined(QT_OPENGL_ES_3)
out << "PyQt_Desktop_OpenGL\n";
#endif

But nowhere do I find any of QT_NO_OPENGL or QT_OPENGL_ES* defined in
the package (but then maybe I'm not looking in the right place?).  So I
don't know why these two classes are included in the amd64 version of
the package but not the armhf version.

I could just protect the two imports with a try/except block, but I
would like to know that this is the right thing to do first!
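
That is, something like this in qtpy (the None fallback is just a
placeholder for whatever qtpy should sensibly do):

try:
    from PyQt5.QtGui import QOpenGLTimeMonitor, QOpenGLTimerQuery
except ImportError:
    # not available when Qt is built without desktop OpenGL (e.g. armhf)
    QOpenGLTimeMonitor = None
    QOpenGLTimerQuery = None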

Thanks,

   Julian



Re: QtPy (python3-qtpy) dependencies

2022-04-07 Thread Julian Gilbey
Hi Ghis,

On Thu, Apr 07, 2022 at 11:03:22AM +0200, ghisv...@gmail.com wrote:
> Hi Julian,
> 
> Le mercredi 06 avril 2022 à 22:01 +0100, Julian Gilbey a écrit :
> > I've just uploaded the latest version of QtPy (source: python-qtpy,
> > binary: python3-qtpy).  But I'm disturbed by the dependency list.
> 
> Thank you for taking care of it.

Pleasure; I needed it for the latest version of Spyder, but it's
turning out to be a little harder than I anticipated!

> > QtPy is a wrapper for PyQt5, PyQt6, PySide2 and PySide6 providing a
> > uniform interface to these different packages.  As such, setup.py
> > etc. do not specify a dependency on any Qt library, but the package
> > is
> > of no use without at least one of them installed.
> 
> Your analysis is correct.
> 
> I'll add that, at the time of the initial package submission, only
> PyQt5 and PySide2 were supported and the latter was in an uncertain
> state maintenance-wise. So it serves pretty much as a shim layer for
> "some wrapper around modern Qt" but the only viable option was PyQt5.
> 
> Hopefully, things are completely different now and the alternatives
> have caught up. Which I guess is the origin of your struggle today.

That makes a lot of sense!

> > At present, the Debian python3-qtpy package depends on 17
> > python3-pyqt5* packages, which seems to be at odds with the intention
> > of the package to be Qt-package agnostic.  It seems that it would be
> > cleaner for python3-qtpy to
> >   Recommends: python3-pyqt5 | python3-pyside2.qtcore
> > or perhaps to Depends: on these, and then if any packages require any
> > more functionality than that provided by python3-pyqt5 or
> > python3-pyside2.qtcore, they should explicitly state the packages
> > they
> > depend on.  But it seems strange that a package depending on
> > python3-qtpy should automatically pull in
> > python3-pyqt5.qttexttospeech, for example.
> 
> Agreed. I suppose the PyQt5 ecosystem has grown since initial
> submission. There are two issues here: hard or soft depends and to
> which core packages.
> 
> Regarding Depends vs Recommends, as you correctly stated before, QtPy
> is useless without a "backend" implementation. Which is why I chose
> Depends with a choice of alternatives. This way, a client package
> depending on QtPy would always get a default implementation, whilst
> having the possibility to override it with its own (but then what's the
> point?).

That makes a lot of sense as well.  At present, it still seems that
PyQt5 is the most standard, but I don't have much of evidence to back
up that hunch.

> Regarding which packages to depend on, that's subject to which subset
> of Qt{5,6} is supported by QtPy today. These might be in need of an
> update.

It appears that PyQt6 isn't in Debian yet, so that's not really an
option.  Neither is PySide6, so it's just PyQt5 or PySide2.

> > On the other hand, there are 13 packages in testing that depend on
> > python3-qtpy, so they would potentially all require modifications to
> > their dependencies if we made this change.  (Three of these are
> > "mine", but that still leaves 10 that are not.)  I have not yet gone
> > through all 13 to see what python3-pyqt5.* dependencies they actually
> > have.
> 
> I'd be in favour of the least invasive option, which is to still use
> QtPy with Depends on a default implementation but update it with what's
> available today.

I'm happy with sticking with that.  I still think we have too many
dependencies, though; it's become rather a metapackage for PyQt5!  But
it's not a big deal.

> > I'd appreciate thoughts on how to proceed from this group before
> > doing
> > anything.
> 
> Good luck.
> 
> Cheers,
> Ghis

Thanks!

   Julian



QtPy (python3-qtpy) dependencies

2022-04-06 Thread Julian Gilbey
I've just uploaded the latest version of QtPy (source: python-qtpy,
binary: python3-qtpy).  But I'm disturbed by the dependency list.

QtPy is a wrapper for PyQt5, PyQt6, PySide2 and PySide6 providing a
uniform interface to these different packages.  As such, setup.py
etc. do not specify a dependency on any Qt library, but the package is
of no use without at least one of them installed.
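
To illustrate: a client application written against QtPy does not name
any binding at all.  Something like the following runs with whichever
binding QtPy finds installed (PyQt5 or PySide2 in Debian at present):

from qtpy.QtWidgets import QApplication, QLabel

app = QApplication([])
label = QLabel("Rendered by whichever Qt binding QtPy found")
label.show()
app.exec_()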

At present, the Debian python3-qtpy package depends on 17
python3-pyqt5* packages, which seems to be at odds with the intention
of the package to be Qt-package agnostic.  It seems that it would be
cleaner for python3-qtpy to
  Recommends: python3-pyqt5 | python3-pyside2.qtcore
or perhaps to Depends: on these, and then if any packages require any
more functionality than that provided by python3-pyqt5 or
python3-pyside2.qtcore, they should explicitly state the packages they
depend on.  But it seems strange that a package depending on
python3-qtpy should automatically pull in
python3-pyqt5.qttexttospeech, for example.

On the other hand, there are 13 packages in testing that depend on
python3-qtpy, so they would potentially all require modifications to
their dependencies if we made this change.  (Three of these are
"mine", but that still leaves 10 that are not.)  I have not yet gone
through all 13 to see what python3-pyqt5.* dependencies they actually
have.

I'd appreciate thoughts on how to proceed from this group before doing
anything.

Best wishes,

   Julian



Re: PyQt5 question: why are QOpenGLTimeMonitor/QOpenGLTimerQuery not defined on armhf?

2022-04-15 Thread Julian Gilbey
On Fri, Apr 15, 2022 at 03:01:43PM +0100, Peter Michael Green wrote:
>  But nowhere do I find any of QT_NO_OPENGL or QT_OPENGL_ES* defined in
>  the package (but then maybe I'm not looking in the right place?  So I
>  don't know why these two classes are included in the amd64 version of
>  the package but not the armhf version.
> 
> The reason is that QT on armel/armhf is built for opengl ES rather
> than desktop opengl. I believe the reason for this is historic
> hardware support on armhf systems.

Thanks Peter!

Best wishes,

   Julian



Re: Uncleaned egg-info directory giving lots of bugs about failing to build after successful build

2023-09-07 Thread Julian Gilbey
On Wed, Sep 06, 2023 at 08:05:45AM -0700, Soren Stoutner wrote:
> As a followup question, I have noticed that a lot of packages (including
> electrum, which I have recently started maintaining) ship the egg-info
> directory.  Looking through /usr/lib/python3/dist-packages/, this is common
> but not universal.  Is there any reason to ship this directory or should it
> be removed from the binary packages?

Lots of packages depend on the egg-info directory being present.

$ grep EASY-INSTALL-ENTRY-SCRIPT /usr/bin/*

will give a (probably very long) list of executables that depend on an
egg-info (or equivalent) directory being present.
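
Such a script, as generated by old-style setuptools, looks roughly like
this (package and entry point names here are just illustrative), and
the load_entry_point() call needs the egg-info metadata at run time:

#!/usr/bin/python3
# EASY-INSTALL-ENTRY-SCRIPT: 'foo==1.2.3','console_scripts','foo'
import re
import sys
from pkg_resources import load_entry_point

if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
    sys.exit(
        load_entry_point('foo==1.2.3', 'console_scripts', 'foo')()
    )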

Best wishes,

   Julian



Re: Bug#1042443: ITP: pathos -- Framework for heterogeneous parallel computing

2023-09-01 Thread Julian Gilbey
Hi Agathe,

On Fri, Sep 01, 2023 at 09:46:00AM +0200, Agathe Porte wrote:
> Hi Julian,
> 
> 2023-07-28 10:59 CEST, Julian Gilbey:
> > Package: wnpp
> > Severity: wishlist
> > Owner: Julian Gilbey 
> > X-Debbugs-Cc: debian-de...@lists.debian.org, Debian Python Team 
> > 
> >
> > * Package name: pathos
> >   Version : 0.3.1
> >   Upstream Contact: Mike McKerns 
> > * URL : https://github.com/uqfoundation/pathos
> > * License : BSD-3-clause
> >   Programming Lang: Python
> >   Description : Framework for heterogeneous parallel computing
> >
> > […]
> >
> >
> > This is a package I've started using; it provides a very effective
> > framework for parallel computing, allowing for constructs that the
> > standard Python library does not support.
> >
> > I will maintain it within the Debian Python Team.
> 
> Like python-ppft, this was already packaged in the Python team but not
> uploaded: https://salsa.debian.org/python-team/packages/python-pathos
> 
> Maybe you can find inspiration in it. I think we should only keep one of
> the two repos in the DPT because it can be confusing to have the same
> package twice.

Oh!  I had not realised.  I didn't see an ITP for this package, and so
I went ahead and did it myself.  So yes, let's delete or archive the
duplicate repos for this and ppft; would you be able to do that?

Now I'm just waiting on upgrades to python3-multiprocess and
python3-dill before I can upload a source-only version of these new
packages.

Best wishes,

   Julian



Re: Uncleaned egg-info directory giving lots of bugs about failing to build after successful build

2023-08-18 Thread Julian Gilbey
On Fri, Aug 18, 2023 at 09:23:17AM -0400, Scott Talbert wrote:
> On Fri, 18 Aug 2023, Andreas Tille wrote:
> 
> > Am Fri, Aug 18, 2023 at 01:42:53PM +0100 schrieb Julian Gilbey:
> > > I'm sure I'm not the only one who received a whole bunch of bugs
> > > entitled "Fails to build source after successful build" last weekend.
> > > There was one theme common to most of them: the presence of a
> > > *.egg-info directory which was not cleaned by debian/rules clean.
> > > [...]
> 
> It is being worked on:
> https://salsa.debian.org/python-team/tools/dh-python/-/merge_requests/46

Amazing!

Thanks,

   Julian



Uncleaned egg-info directory giving lots of bugs about failing to build after successful build

2023-08-18 Thread Julian Gilbey
I'm sure I'm not the only one who received a whole bunch of bugs
entitled "Fails to build source after successful build" last weekend.
There was one theme common to most of them: the presence of a
*.egg-info directory which was not cleaned by debian/rules clean.

I know the bug report said that this policy is currently under
discussion, but I did get thinking about it.  I imagine that this
particular directory should be the responsibility of dh-python to
clean up, but it may not be sensible to always delete *.egg-info
directories, as they may be present in the orig.tar.gz file.  One
could handle it by manually adding this directory to debian/clean in
each package, but perhaps this should be the default behaviour of
dh-python?

Any thoughts?

Best wishes,

   Julian



Messed up a salsa commit - how best to fix?

2022-04-24 Thread Julian Gilbey
Hi,

Somehow I managed to really mess up a commit to python-qtconsole: the
upstream and pristine-tar branches do not have the upstream/5.3.0
sources (the current ones).  However, there's already an
upstream/5.3.0 tag in the repository, pointing to a commit to the
master branch.

I think the simplest thing to do is to "rewrite history": delete the
head commits to the master branch and the 5.3.0 tags, and then
recommit correctly and force-push to salsa.

Would people be OK with me doing this, or do you have an alternative
suggestion?

Best wishes,

   Julian



Re: Messed up a salsa commit - how best to fix?

2022-04-24 Thread Julian Gilbey
On Sun, Apr 24, 2022 at 10:09:21PM +0200, Geert Stappers wrote:
> On Sun, Apr 24, 2022 at 09:01:02PM +0100, Julian Gilbey wrote:
> > Hi,
> > 
> > Somehow I managed to really mess up a commit to python-qtconsole: the
> > upstream and pristine-tar branches do not have the upstream/5.3.0
> > sources (the current ones).  However, there's already an
> > upstream/5.3.0 tag in the repository, pointing to a commit to the
> > master branch.
> > 
> > I think the simplest thing to do is to "rewrite history": delete the
> > head commits to the master branch and the 5.3.0 tags, and then
> > recommit correctly and force-push to salsa.
> > 
> > Would people be OK with me doing this,
> 
> I'm not OK with rewriting history.
> 
> > or do you have an alternative suggestion?
> 
> Accept the failure, learn from it, move on, make new mistakes, learn from 
> them.
> 
> In other words:  Do not spend energy on erasing a mistake^Wlearning experience.

I do understand this, but now someone cloning the repository and
running "gbp buildpackage" won't be able to do so.

I realise there is a much more minor change that would fix things:
just delete the upstream/5.3.0 tag, create the upstream/5.3.0
pristine-tar and upstream branch contents manually, make a commit with
those and recreate the upstream/5.3.0 tag to point to those.

Would that be acceptable?

Best wishes,

   Julian



Re: Messed up a salsa commit - how best to fix?

2022-04-24 Thread Julian Gilbey
Hi Timo,

On Sun, Apr 24, 2022 at 10:42:51PM +0200, Timo Röhling wrote:
> Hi Julian,
> 
> * Julian Gilbey  [2022-04-24 21:01]:
> > Somehow I managed to really mess up a commit to python-qtconsole: the
> > upstream and pristine-tar branches do not have the upstream/5.3.0
> > sources (the current ones).  However, there's already an
> > upstream/5.3.0 tag in the repository, pointing to a commit to the
> > master branch.
> > 
> > I think the simplest thing to do is to "rewrite history": delete the
> > head commits to the master branch and the 5.3.0 tags, and then
> > recommit correctly and force-push to salsa.
> > 
> > Would people be OK with me doing this, or do you have an alternative
> > suggestion?
> I looked at the Salsa repository, and it is not so bad. It seems like you
> forgot to pull the latest changes in upstream and pristine-tar from
> the 5.2.2 import first, so your import of 5.3.0 forked those branches
> unintentionally.

Thanks for the analysis, but I don't think that's what happened.  I
think what I did was something like this:

* I had an up-to-date clone of the repository from salsa (at tag
  debian/5.2.2-1 on master, and at tag upstream/5.2.2 on upstream)
* I ran gbp import-orig on the new 5.3.0 sources.
* I realised I'd done something wrong, so I tried to reset --hard all
  three branches back to before the import.
* I then re-imported the 5.3.0 sources correctly, or so I thought.

(That can't be exactly right, but it may be close to what happened.)
But somehow, that didn't work properly, and what I've ended up with is
that the master branch is correct, with the upstream/5.3.0 and
debian/5.3.0-1 tags pointing at the correct commits, but the upstream
and pristine-tar branches are not up-to-date: they don't have the
upstream 5.3.0 contents.

So what I'm thinking I could do is:
* Unpack the 5.3.0 sources into the upstream branch and commit the
  change
* Reset the upstream/5.3.0 tag to point to this new commit
* Use gbp pristine-tar to create the .delta and .id files on the
  pristine-tar branch

This does not rewrite history, but it fixes the gbp problem, with the
only change required being a modification to the upstream/5.3.0 tag.

Thoughts?

   Julian



Re: Messed up a salsa commit - how best to fix?

2022-04-24 Thread Julian Gilbey
On Sun, Apr 24, 2022 at 10:49:55PM +0200, Timo Röhling wrote:
> * Timo Röhling  [2022-04-24 22:42]:
> > Make sure that your local upstream branch and the upstream/5.3.0 tag
> > both point at commit 08935221b549bf32157d739cd54eb1645a2ab123:
> Aaaand I copied the wrong commit hash to the email. :/
> e228e8902aeb91011a53bb1a91f7f3390a771e0e is the one you should be
> looking for:
> 
> https://salsa.debian.org/python-team/packages/python-qtconsole/-/commit/e228e8902aeb91011a53bb1a91f7f3390a771e0e

Indeed, that was the commit made by gbp import-orig.  But the upstream
and pristine-tar branches don't include that commit, unfortunately,
because of my mistake.  Therefore gbp buildpackage doesn't work,
because it can't find the original sources on the upstream branch.

Best wishes,

   Julian



Re: Messed up a salsa commit - how best to fix?

2022-04-24 Thread Julian Gilbey
On Sun, Apr 24, 2022 at 10:49:55PM +0200, Timo Röhling wrote:
> * Timo Röhling  [2022-04-24 22:42]:
> > Make sure that your local upstream branch and the upstream/5.3.0 tag
> > both point at commit 08935221b549bf32157d739cd54eb1645a2ab123:
> Aaaand I copied the wrong commit hash to the email. :/
> e228e8902aeb91011a53bb1a91f7f3390a771e0e is the one you should be
> looking for:
> 
> https://salsa.debian.org/python-team/packages/python-qtconsole/-/commit/e228e8902aeb91011a53bb1a91f7f3390a771e0e

Ah, I understand now!  This commit is actually the upstream sources!
I hadn't realised exactly what gbp does.  I'll have a go at following
your instructions in your last email tomorrow.

Thanks,

   Julian



Re: Messed up a salsa commit - how best to fix?

2022-04-26 Thread Julian Gilbey
Hi Timo (and Geert),

On Sun, Apr 24, 2022 at 10:42:51PM +0200, Timo Röhling wrote:
> Hi Julian,
> 
> * Julian Gilbey  [2022-04-24 21:01]:
> > Somehow I managed to really mess up a commit to python-qtconsole: the
> > upstream and pristine-tar branches do not have the upstream/5.3.0
> > sources (the current ones).  However, there's already an
> > upstream/5.3.0 tag in the repository, pointing to a commit to the
> > master branch.
> > 
> > I think the simplest thing to do is to "rewrite history": delete the
> > head commits to the master branch and the 5.3.0 tags, and then
> > recommit correctly and force-push to salsa.
> > 
> > Would people be OK with me doing this, or do you have an alternative
> > suggestion?
> I looked at the Salsa repository, and it is not so bad. It seems like you
> forgot to pull the latest changes in upstream and pristine-tar from
> the 5.2.2 import first, so your import of 5.3.0 forked those branches
> unintentionally.
> [...]

Thanks for all the advice!  I managed to sort it moderately cleanly in
the end, and this email records what happened and what I did, in case
anyone might benefit from this in the future.

It turns out that I'd also messed up more than I'd realised: even when
I pulled in the updated master branch, I didn't pull the upstream
branch, so I managed to introduce even more conflicts.  Oh well.

But the key things that allowed for a moderately clean fix were:

* I'd correctly used gbp import-orig to pull in the original 5.3.0
  distribution to the master branch

* I had an upstream/5.3.0 tag in my local repository (which for some
  reason I hadn't pushed, yeesh)

So the state of the salsa repository was (in an ideal world where I'd
pulled upstream):
- master included the upstream/5.3.0 commit, tagged as upstream/5.3.0,
  along with further commits
- upstream was at upstream/5.2.2
- pristine-tar contained data up to upstream/5.2.2

To fix the problem, I did:

$ git checkout upstream
$ git reset --hard upstream/5.3.0
$ git checkout master
$ gbp pristine-tar commit

and that fixed everything.  I finished with git push --all and git
push --tags.

I hope I don't make this mistake again!

Best wishes,

   Julian



Re: Messed up a salsa commit - Reporting the fix

2022-04-26 Thread Julian Gilbey
On Tue, Apr 26, 2022 at 01:26:26PM +0200, Geert Stappers wrote:
> [...]
> > So the state of the salsa repository was (in an ideal world where I'd
> > pulled upstream):
> > - master included the upstream/5.3.0 commit, tagged as upstream/5.3.0,
> >   along with further commits
> > - upstream was at upstream/5.2.2
> > - pristine-tar contained data up to upstream/5.2.2
> > 
> > To fix the problem, I did:
> > 
> > $ git checkout upstream
> > $ git reset --hard upstream/5.3.0
> 
> FWIW Here I miss a `git commit -a`

Actually there is no git commit here, surprisingly.  When running
`gbp import-orig`, if I understand correctly, it checks out the
previous upstream in the master branch, then unpacks the new upstream
and commits it, tagging it as "upstream/..." - this commit has a
single parent: the previous upstream commit.  Then it merges the
debian/ directory of the previous commit in the master branch with the
new upstream.  The "upstream" branch contains just the upstream
commits; "upstream" is just a pointer to the most recent upstream
commit, so the `git reset` command here moves the pointer to the
correct place.  There are no uncommitted files.

> Thanks for reporting, thanks for sharing what was learnt.

:-)

> > I hope I don't make this mistake again!
> 
> No worries, there will be other mistakes  and that is good.
> Know that the only way to avoid mistakes is doing nothing.

That is indeed true!

Best wishes,

   Julian



Re: Messed up a salsa commit - how best to fix?

2022-04-26 Thread Julian Gilbey
Hi Timo,

On Tue, Apr 26, 2022 at 01:55:13PM +0200, Timo Röhling wrote:
> Hi Julian,
> 
> * Julian Gilbey  [2022-04-26 11:03]:
> > It turns out that I'd also messed up more than I'd realised: even when
> > I pulled in the updated master branch, I didn't pull the upstream
> > branch, so managed to introduce even more conflicts.  Oh well.
> It's an easy mistake to write "git pull" if you meant to do "gbp
> pull". I lost count how often I wrote "git pq" by accident...

Ah, I didn't know about gbp pull/push!  I'm definitely going to use
those in future (and repeatedly make that same typing mistake!).

> > To fix the problem, I did:
> > 
> > $ git checkout upstream
> > $ git reset --hard upstream/5.3.0
> Judging from the current commit graph, you probably threw in a
> "git merge -s ours origin/upstream" here as well?

Yeah :-( Well, sort of.  I did the git reset, then git push --all and
got an error because I hadn't done a pull on this branch :-(  So then I
did a git pull on this branch, followed by resolving the conflicts.

> > $ git checkout master
> > $ gbp pristine-tar commit
> > 
> > and that fixed everything.  I finished with git push --all and git
> > push --tags.
> Nice!
> 
> > I hope I don't make this mistake again!
> Don't worry about it too much. Git is quite resilient, and as long
> as you do not panic and start force-pushing random stuff, everything
> can be repaired.

I'm not too worried, just that it took far more effort than it would
have done if I'd done things right to begin with!

Best wishes,

   Julian



Re: Move python-pytest-asyncio to debian-python-team shared repository?

2022-05-26 Thread Julian Gilbey
Hi Jonas,

On Fri, May 27, 2022 at 01:11:45AM +0200, Jonas Smedegaard wrote:
> Hi Julian and other Pythonistas,
> 
> Quoting Julian Gilbey (2022-05-27 00:44:22)
> > Currently python-pytest-asyncio is maintained just by you.  I wonder
> > whether you would be willing to move this to the Debian Python Team's
> > shared repository on salsa and transfer the maintainership to the
> > team, with you as primary uploader?
> 
> I would prefer to let go: Feel free to adopt pytest-asyncio and
> python-mock.
> 
> Kind regards,
> 
>  - Jonas

Firstly, a massive apology - I didn't check the version of
python-pytest-asyncio in unstable, and so hadn't noticed that you'd
updated it really recently.  Mea culpa.

Second, my suggestion of moving it to the team was not meant in any
way to say that I don't appreciate the amazing job you're doing.  In
fact, most packages in the Python Team repository are in reality
maintained by one person; the primary benefits of being within the
team are the mutual support and the ability to keep packages
up-to-date or fixed if there are some problems and the maintainer
can't get to them.  (And sometimes the Debian Janitor fixes issues in
packages automatically.)

So it's entirely up to you; I can't personally take on the package
(I'm already overloaded), but if you are happy to continue maintaining
it, would you consider migrating it to the team repository?

Best wishes,

   Julian



Move python-pytest-asyncio to debian-python-team shared repository?

2022-05-26 Thread Julian Gilbey
Dear Jonas,

I note that there is currently a serious bug against both
python-pytest-asyncio and pytest-mock
(https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1006736), and this
bug is preventing the migration of several major packages to testing.
I also note that the current Debian version of python-pytest-asyncio
is 0.16.0 while the upstream version is already 0.18.3; this may
(partly) address some of the issues.

Currently python-pytest-asyncio is maintained just by you.  I wonder
whether you would be willing to move this to the Debian Python Team's
shared repository on salsa and transfer the maintainership to the
team, with you as primary uploader?  (Most of the pytest-* packages
are already team-maintained.)  It's a very friendly team, and the
mailing list is quite low volume.  It would also mean that if there is
an issue like this that you don't have the time to deal with, someone
else from the team would be able to upload a new version of the
package for you.

The team policy is here:
https://salsa.debian.org/python-team/tools/python-modules/blob/master/policy.rst
and you can read more here: https://wiki.debian.org/Teams/PythonTeam/

Looking forward to hearing from you!

Best wishes,

   Julian



Updating pytest

2022-06-02 Thread Julian Gilbey
Hi all,

When I updated pytest-mock, I noticed that pytest is somewhat out of
date and it would be good to upgrade it.  But it's quite a major
package, and I don't really want to do it without a go-ahead from
others.

Perhaps we could upload a newer version to experimental first to see
what breaks?

Best wishes,

   Julian



Re: pdm-pep517: Shall we package it now?

2022-07-08 Thread Julian Gilbey
On Thu, Jul 07, 2022 at 03:36:30PM -0400, Boyuan Yang wrote:
> Hi,
> 
> > On Tue, 2022-06-28 at 11:19 -0400, Louis-Philippe Véronneau wrote:
> > On 2022-06-28 09 h 24, Boyuan Yang wrote:
> > > Hi all,
> > > 
> > > I have encountered more and more packages that uses pdm-pep517 as build
> > > backend. Looking at [1], existing packages in Debian added patches to
> > > manually switch to other backends, such as Poetry.
> > > 
> > > I am wondering if it's time to package pdm-pep517 itself [2], or is
> > > there
> > > any blocking for it. I am aware that some sort of bootstrapping might be
> > > needed since pdm-pep517 seems to build-depends on itself. Besides that,
> > > what
> > > about packaging of pdm? Please correct me if needed: my mind and my
> > > packaging work is still stuck in the old times of setup.py, and I just
> > > started to look into the new ecosystem of pep517. Thanks!
> > > 
> > > Regards,
> > > Boyuan Yang
> > > 
> > > 
> > > [1] https://codesearch.debian.net/search?q=pdm.pep517
> > > [2] https://github.com/pdm-project/pdm-pep517
> > 
> > Once packaged, please ping me so I can update the
> > "missing-prerequisite-for-pyproject-backend" Lintian tag accordingly and
> > let people know they can migrate to it.
> 
> This is now accepted at https://tracker.debian.org/pkg/pdm-pep517 .
> 
> Cheers,
> Boyuan Yang

Brilliant!  Thanks for this!  (I think you'll need to do a source-only
upload now to allow migration to testing, unless the procedures have
changed recently.)

Best wishes,

   Julian



Re: archive rebuild for pytest from experimental

2022-07-04 Thread Julian Gilbey
On Fri, Jun 24, 2022 at 06:53:04PM -0400, Louis-Philippe Véronneau wrote:
> Thank you for your guidance.
> 
> I have filed all of the regressions you reported in the BTS:
> 
> https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=pytest7;users=debian-python@lists.debian.org

Thanks Louis-Philippe!

Best wishes,

   Julian



Strangely rare pytest 7.x bug report

2022-07-04 Thread Julian Gilbey
Dear all,

I wonder whether you might have any clue about
https://bugs.debian.org/1013700
I have mostly worked out the "cause" of the bug, but I haven't quite
got to the bottom of it.

When running the command
PYTHONPATH=. python3.10 -m pytest qtpy/tests
in the directory .pybuild/cpython3_3.10_qtpy/build, I get the error
message:

ImportError while loading conftest 
'/home/jdg/debian/spyder-packages/qtpy/build-area/python-qtpy-2.1.0/.pybuild/cpython3_3.10_qtpy/build/qtpy/tests/conftest.py'.
TypeError: the 'package' argument is required to perform a relative import for 
'.pybuild.cpython3_3.10_qtpy.build.qtpy.tests'

If the directory .pybuild is renamed to pybuild, the tests run without
a problem.  So there seems to be something funny about conftest.py
(and removing all of the other files from the qtpy/tests directory
except for the empty __init__.py gives the same error); here's a link
to it:

https://salsa.debian.org/python-team/packages/python-qtpy/-/blob/master/qtpy/tests/conftest.py

But there doesn't seem to be anything out of the ordinary about this.
So I am mystified: why does pytest 7.x seem to not give this error on
any other Debian package?

The only solution I currently have for this package is to skip the tests
at build time and rely on autopkgtest to run them.

Best wishes,

   Julian



Re: Strangely rare pytest 7.x bug report

2022-07-04 Thread Julian Gilbey
Hi Carles,

It is utterly, utterly bizarre.  But I think I've found the problem.
There's a pytest.ini file in the package, but it's not copied into the
test directory.  So when pytest is run in the .pybuild directory, it
climbs all the way back up the directory tree to the python-qtpy-2.1.0
directory, until it discovers the pytest.ini file there and uses that.
It sees that we are requesting qtpy/tests, which it then expands into
the directory path .pybuild/cpython3_3.10_qtpy/build/qtpy/tests,
starting from the directory in which it found pytest.ini, and this
causes the breakage.
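
In other words, pytest derives the test package name from the path
relative to the directory containing pytest.ini, and because .pybuild
starts with a dot, so does the resulting module name.  Here is a rough
illustration of the path arithmetic (this is not pytest's actual code,
and the rootdir path is made up):

import os

rootdir = "/home/user/python-qtpy-2.1.0"  # hypothetical: where pytest.ini was found
testdir = rootdir + "/.pybuild/cpython3_3.10_qtpy/build/qtpy/tests"

# the test package name is built from the rootdir-relative path...
relpath = os.path.relpath(testdir, rootdir)
module_name = relpath.replace(os.sep, ".")
print(module_name)
# -> '.pybuild.cpython3_3.10_qtpy.build.qtpy.tests'
# ...and a name starting with "." looks like a relative import to
# importlib, hence the "the 'package' argument is required" TypeError.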

The solution is to copy the pytest.ini file into the .pybuild
directories by adding it to debian/pybuild.testfiles.

Why this behaviour changed between pytest 6.x and pytest 7.x I don't
know; I don't see it obviously documented.  But that at least resolves
this problem.

Thanks for your help!

Best wishes,

   Julian

On Mon, Jul 04, 2022 at 08:39:32PM +0100, Carles Pina i Estany wrote:
> 
> Hi Julian,
> 
> On Jul/04/2022, Julian Gilbey wrote:
> > Hi Carles,
> > 
> > Thanks for your thoughts!  Yes, indeed that seems to be the issue.
> > But what I don't understand is why the import is turned into
> > .pybuild.cpython3_3.9_qtpy.build.qtpy.tests and not just qtpy.tests or
> 
> I see how pytest does it (but keep reading)
> 
> > a longer path, and why only this package fails in this way.  Perhaps
> > this is the only package that has an import statement in
> > pytest_configure?
> 
> This I don't know and I'm curious, and it might help dissecting the issue
> (or understanding it). Do you know of any other python3 package that you
> expected to fail? (using pytest in a similar way).
> 
> I might try to get both and follow what they do different (to hopefully
> know what is python-qtpy doing different :-) )
> 
> I'm sure that there are tons of packages that use pytest :-) I'm
> wondering if you had a good candidate.
> 
> Best regards,
> 
> > 
> > Best wishes,
> > 
> >Julian
> > 
> > On Mon, Jul 04, 2022 at 04:03:39PM +0100, Carles Pina i Estany wrote:
> > > 
> > > Hi,
> > > 
> > > I'm a lurker of debian-python@lists.debian.org but seeing Python+Qt I
> > > wanted to have a look. I don't have a solution (I might look more
> > > another time if time permits) but I might have something that might help
> > > someone who knows the tools better.
> > > 
> > > I am not familiar with Python Debian packaging details/tools neither
> > > with pytest :-( so take all of this with a pinch of salt.
> > > 
> > > If it helps the error comes from:
> > > /usr/lib/python3.9/importlib/__init__.py in the function "import_module"
> > > it does:
> > > """
> > > if name.startswith('.'):
> > > if not package:
> > > msg = ("the 'package' argument is required to perform a 
> > > relative "
> > >"import for {!r}")
> > > raise TypeError(msg.format(name))
> > > """
> > > 
> > > When the import fails, the "name" parameter of the "import_module" function
> > > is: '.pybuild.cpython3_3.9_qtpy.build.qtpy.tests', which is derived
> > > from the hidden directory ".pybuild" as created by default by "pybuild".
> > > 
> > > I think that the initial "." is used only as a directory name but Python
> > > assumes that it is a relative import requiring the package parameter.
> > > 
> > > Just to check my thoughts, and after running dpkg-buildpackage and
> > > failing let's try again:
> > > 
> > > $ cd .pybuild/cpython3_3.9_qtpy/build; python3.9 -m pytest qtpy/tests ; 
> > > cd -
> > > Fails with the:
> > > 
> > > TypeError: the 'package' argument is required to perform a relative 
> > > import for '.pybuild.cpython3_3.9_qtpy.build.qtpy.tests'
> > > /home/carles/git/python-qtpy
> > > 
> > > Then let's try to avoid the initial "." confusion:
> > > 
> > > $ mv .pybuild pybuild
> > > $ cd pybuild/cpython3_3.9_qtpy/build; python3.9 -m pytest qtpy/tests ; cd 
> > > -
> > > 
> > > It works.
> > > 
> > > I don't know why this is the only package affected by this though...
> > > 
> > > Hopefully it helps a bit!
> > > 
> > > On Jul/04/2022, Julian Gilbey wrote:
> > > > Dear all,
> > > > 
> > > > I wonder whether you might have any clue about
> > > > https://bugs.

Re: Strangely rare pytest 7.x bug report

2022-07-04 Thread Julian Gilbey
Hi Carles,

Thanks for your thoughts!  Yes, indeed that seems to be the issue.
But what I don't understand is why the import is turned into
.pybuild.cpython3_3.9_qtpy.build.qtpy.tests and not just qtpy.tests or
a longer path, and why only this package fails in this way.  Perhaps
this is the only package that has an import statement in
pytest_configure?
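
To illustrate what I mean by that last point, a conftest.py of roughly
this shape (purely hypothetical - not the actual qtpy file) is the kind
of thing I'm speculating about:

import importlib

def pytest_configure(config):
    # an import of the package under test at configure time - this is
    # the sort of "import statement in pytest_configure" I mean
    qtpy = importlib.import_module("qtpy")
    print("testing against qtpy", qtpy.__version__)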

Best wishes,

   Julian

On Mon, Jul 04, 2022 at 04:03:39PM +0100, Carles Pina i Estany wrote:
> 
> Hi,
> 
> I'm a lurker of debian-python@lists.debian.org but seeing Python+Qt I
> wanted to have a look. I don't have a solution (I might look more
> another time if time permits) but I might have something that might help
> someone who knows the tools better.
> 
> I am not familiar with Python Debian packaging details/tools neither
> with pytest :-( so take all of this with a pinch of salt.
> 
> If it helps the error comes from:
> /usr/lib/python3.9/importlib/__init__.py in the function "import_module"
> it does:
> """
> if name.startswith('.'):
> if not package:
> msg = ("the 'package' argument is required to perform a relative "
>"import for {!r}")
> raise TypeError(msg.format(name))
> """
> 
> When the import fails, the "name" parameter of the "import_module" function
> is: '.pybuild.cpython3_3.9_qtpy.build.qtpy.tests', which is derived
> from the hidden directory ".pybuild" as created by default by "pybuild".
> 
> I think that the initial "." is used only as a directory name but Python
> assumes that it is a relative import requiring the package parameter.
> 
> Just to check my thoughts, and after running dpkg-buildpackage and
> failing let's try again:
> 
> $ cd .pybuild/cpython3_3.9_qtpy/build; python3.9 -m pytest qtpy/tests ; cd -
> Fails with the:
> 
> TypeError: the 'package' argument is required to perform a relative import 
> for '.pybuild.cpython3_3.9_qtpy.build.qtpy.tests'
> /home/carles/git/python-qtpy
> 
> Then let's try to avoid the initial "." confusion:
> 
> $ mv .pybuild pybuild
> $ cd pybuild/cpython3_3.9_qtpy/build; python3.9 -m pytest qtpy/tests ; cd -
> 
> It works.
> 
> I don't know why this is the only package affected by this though...
> 
> Hopefully it helps a bit!
> 
> On Jul/04/2022, Julian Gilbey wrote:
> > Dear all,
> > 
> > I wonder whether you might have any clue about
> > https://bugs.debian.org/1013700
> > I have mostly worked out the "cause" of the bug, but I haven't quite
> > got to the bottom of it.
> > 
> > When running the command
> > PYTHONPATH=. python3.10 -m pytest qtpy/tests
> > in the directory .pybuild/cpython3_3.10_qtpy/build, I get the error
> > message:
> > 
> > ImportError while loading conftest 
> > '/home/jdg/debian/spyder-packages/qtpy/build-area/python-qtpy-2.1.0/.pybuild/cpython3_3.10_qtpy/build/qtpy/tests/conftest.py'.
> > TypeError: the 'package' argument is required to perform a relative import 
> > for '.pybuild.cpython3_3.10_qtpy.build.qtpy.tests'
> > 
> > If the directory .pybuild is renamed to pybuild, the tests run without
> > a problem.  So there seems to be something funny about conftest.py
> > (and removing all of the other files from the qtpy/tests directory
> > except for the empty __init__.py gives the same error); here's a link
> > to it:
> > 
> > https://salsa.debian.org/python-team/packages/python-qtpy/-/blob/master/qtpy/tests/conftest.py
> > 
> > But there doesn't seem to be anything out of the ordinary about this.
> > So I am mystified: why does pytest 7.x seem to not give this error on
> > any other Debian package?
> > 
> > The only solution I currently have for this package is skip the tests
> > at build time and rely on autopkgtest to do them.
> > 
> > Best wishes,
> > 
> >Julian



Re: archive rebuild for pytest from experimental

2022-07-11 Thread Julian Gilbey
Hi,

On Fri, Jul 08, 2022 at 09:33:10PM +0200, Carsten Schoenert wrote:
> Hi,
> 
> On 16.06.22 at 10:05, Julian Gilbey wrote:
> ...
> > Great, thanks.  Since the pygments in testing fails on pytest 7.2.1,
> > and the version in experimental depends on pytest >= 7.0, we'll need
> > to do the following when we are ready to upload pytest 7.2.1 to
> > unstable:
> > 
> > * Mark pytest 7.2.1 as Breaks: pygments (<< 2.12.0)
> 
> looking at the autopkgtest on the CI, pygments is still failing while testing
> with versions from testing, mostly, I think, because pytest < 7 (from
> testing) is used. All other version differences are not critical, I guess.
> The following list of used packages is given by the different versions
> between testing and unstable.

Yes, indeed; as the new pygments depends on pytest >= 7 and the version
currently in testing breaks with pytest 7, pytest must declare Breaks:
python3-pygments (<< 2.12.0).  [Explanation: britney (?) tries each
package on its own when testing for migration: pytest can't migrate as
that would break pygments in testing, and pygments can't migrate as it
depends on the newer pytest.  So we're stuck and neither will
migrate.  But if pytest 7 declares a Breaks on pygments (<< 2.12.0),
then britney will attempt to migrate both of them simultaneously.]

Best wishes,

   Julian



Re: Bug#1013425: ITP: wnpp -- Python Airspeed is a powerful templating engine compatible with Velocity for Java

2022-06-24 Thread Julian Gilbey
On Thu, Jun 23, 2022 at 02:00:21PM +0200, Felix Moessbauer wrote:
> Package: wnpp
> Severity: wishlist
> Owner: Felix Moessbauer 
> X-Debbugs-Cc: debian-de...@lists.debian.org, debian-python@lists.debian.org
> 
> * Package name: wnpp
>   Version : 0.5.19
> [...]
> * URL : https://github.com/purcell/airspeed
> [...]
>   Description : Airspeed is a powerful and easy-to-use templating engine 
> for Python that aims for a high level of compatibility with the popular 
> Velocity library for Java.

Hi Felix,

I'm not sure why you'd want to call this package "wnpp" rather than
"airspeed" or "python-airspeed".

Please could you change the title of this wnpp bug!

Best wishes,

   Julian



Re: Updating pytest

2022-06-08 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 03:49:37PM -0400, Sandro Tosi wrote:
> > Sandro: you managed the numpy transition, it seems.  What is involved
> > in something like this?  I would imagine something like:
> >
> > (1) Upload pytest 7.x to experimental
> 
> i took care of this just now, uploading pytest/7.1.2 to experimental
> (and i've messed up the branches on salsa, so i've committed my
> changes to `experimental2`)

I've just looked at
https://release.debian.org/britney/pseudo-excuses-experimental.html -
it's encouraging that there have only been a handful of failures on
amd64.  There have been a few more on arm64, but they seem mostly
insignificant or transient.  Here are the failing tests:

1. finalcif: it seems like an update to python3-gemmi has broken this
package

2. fpylll: The failures are happening in the current unstable version
as well.

3. glyphslib: The failures are happening in the current unstable
version as well.

4. junitparser: This looks like it might be due to the new version of
pytest.

5. jupyter-client: The arm64 failure looks likely to be transient

6. monitoring-plugins-systemd: This looks like it might be due to the
new version of pytest - it's not finding any tests.

7. nibabel: This is a new warning I haven't seen before: there's a
ResourceWarning about an unclosed file, which is putting a message on
stderr and causing the test to fail.

8. pytest-pylint: This looks like it may be due to the new version of
pytest, but I'm not sure.

9. pytest-twisted: Bizarre; it's failing to find the testdir fixture;
pytest 7.x does not claim to have deprecated this, so something is
weird here.

10. python-ase: not sure why the pytest.warns call raises an error
rather than just a warning.  But this is certainly a pytest 7.x issue.
I'm not sure how best to rewrite the code in these tests; different
calls are now needed in the two cases (pytest.warns(...) in the first
case, warnings.catch_warnings() in the second, perhaps? - see the
sketch after this list).

11. python-async-lru: this throws an error to stderr:
 Future exception was never retrieved
 future: 
 ZeroDivisionError
No idea what that's about.

12. python-b2sdk: this fails to select any tests, so presumably a
pytest 7.x issue.

13. python-email-validator: interesting error on arm64, suggesting the
test needs fixing (but not pytest 7.x related)

14. python-etelemetry: looks like it is a pytest 7.x issue (change in
keywords for skipping?)

15. python-httplib2: similar to python-etelemetry

16. python-parameterized: this looks likely to be a pytest 7.x issue;
the meaning of missing_tests has perhaps changed?  Would need to look
at the code more carefully to understand this.

17. python-pytest-subtests: this is a pytest 7.x issue; the package
has been updated upstream

And that's it, so it really doesn't seem too bad.
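
For item 10, the sort of rewrite I have in mind is something like the
following - an untested sketch only, not python-ase's actual test code
(if I understand the changelog correctly, pytest.warns(None) is
deprecated in 7.x, so the "expect no warning" case has to be expressed
differently):

import warnings
import pytest

def test_emits_warning():
    # case 1: a warning is genuinely expected - pytest.warns still works
    with pytest.warns(UserWarning):
        warnings.warn("something worth flagging", UserWarning)

def test_emits_no_warning():
    # case 2: no warning is expected; turn warnings into errors so that
    # any warning fails the test, instead of using pytest.warns(None)
    with warnings.catch_warnings():
        warnings.simplefilter("error")
        result = 1 + 1  # stand-in for the real call under test
    assert result == 2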

Best wishes,

   Julian



Re: Updating pytest

2022-06-08 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 03:49:37PM -0400, Sandro Tosi wrote:
> > Sandro: you managed the numpy transition, it seems.  What is involved
> > in something like this?  I would imagine something like:
> >
> > (1) Upload pytest 7.x to experimental
> 
> i took care of this just now, uploading pytest/7.1.2 to experimental
> (and i've messed up the branches on salsa, so i've committed my
> changes to `experimental2`)

Amazing, thanks!  I might have used a version number such as -1~exp1,
but that is fairly immaterial.  And at least I'm not the only one to
mess up things on salsa! ;-)

> > (2) Arrange with Lucas to test build all rdeps against the
> > experimental version of pytest (by which I mean: all packages which
> > require python3-pytest as a (recursive) build-dependency)
> 
> I'll take care of this soon, likely after pytest has been built on a
> buildd host (so will be either later today EST or tomorrow)

Great!

> > (3) File bugs (with patches where possible) against all packages which
> > either FTBFS with the experimental pytest or which fail their
> > autopkgtest suite with the experimental pytest.  Presumably these bugs
> > would have a usertag associated with them so they can be easily
> > monitored.
> 
> that's something usually Lucas can automate, but he'll provided a set
> of failed/successful logs for us to look at.

OK, that's really helpful.

> > (4) After an appropriate time period, prepare NMUs for remaining bugs.
> >
> > (5) Once all bugs are closed, upload to unstable.
> >
> > I could certainly do (1) and help with (3)-(5) if someone else can do
> > (2) and help with (3)-(5).
> >
> > Best wishes,
> >
> >Julian

Best wishes,

   Julian



Build and run-time triplets

2022-06-08 Thread Julian Gilbey
I'd like to ask for some help.  I'm working on packaging pydevd, which
builds a private .so library.  Ordinary extensions built using cython
or similar end up being called "foo.cpython-310-x86_64-linux-gnu.so",
but this library, which is not dependent on the Python version, should
presumably be called "bar.x86_64-linux-gnu.so".

Question 1: How do I determine (within Python) the triplet to use when
building the library?

Question 2: How do I determine (within Python) the triplet to use when
loading the library at runtime?

Thanks!

   Julian



Re: a review of your bumblebee-status package

2022-06-07 Thread Julian Gilbey
Hi Ben,

On Mon, Jun 06, 2022 at 10:42:53PM -0400, Ben Westover wrote:
> > > _version.py is not a copy of versioneer, it's *generated* by versioneer.
> > > However, there is versioneer.py in the root directory, which is. I'll
> > > exclude that from the source and repack.
> > 
> > hmm... how about that generated file though? shouldn't it be ... well,
> > generated at build time instead? :)
> 
> As far as I understand it, this file is used by the author of the program,
> not end users. I don't understand it well, though, because I haven't put
> much time into researching what versioneer even does.
> If my hunch is correct, I may be able to just remove the file from the
> source altogether, but I haven't tried that yet.

As far as I understand, versioneer (or the _version.py generated by
it) uses a whole bunch of heuristics to determine the version number
of the package, for example by looking at git tags and so on.  Several
times, I have found that _version.py in the PyPI release of a package
is a very small file (just a few lines long) stating the version
number of the package, while the _version.py in GitHub is huge and
doesn't work on a standalone packaged version.  If I recall correctly,
in more than one package I (co-)maintain, I've gone for the PyPI
version of this file.
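
For what it's worth, the PyPI-style _version.py I mean is typically
only a few lines, something like this (an illustrative sketch with a
made-up version number - the exact field names vary between versioneer
releases):

# static stand-in for the git-interrogating _version.py in the
# upstream repository
version = "2.0.3"

def get_versions():
    return {"version": version, "full-revisionid": None,
            "dirty": False, "error": None, "date": None}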

Hope this helps!

   Julian



Re: Updating pytest

2022-06-06 Thread Julian Gilbey
On Sat, Jun 04, 2022 at 10:29:53AM +0800, Paul Wise wrote:
> On Fri, 2022-06-03 at 19:08 +0100, Julian Gilbey wrote:
> 
> > I believe that ci.debian.net checks packages against packages in
> > experimental (see
> > https://ci.debian.net/packages/s/spyder/unstable/amd64/ for example),
> > so it may be that the work is already done for us; what I don't know,
> > though, is how to extract this information for the package in
> > experimental.  Perhaps worth asking the debian-ci people.
> 
> I think this page includes debci results for experimental:
> 
> https://release.debian.org/britney/pseudo-excuses-experimental.html
> 
> It shows what would happen when migrating experimental to unstable.

Oh wow, thanks!  That's perfect.  So we can upload the new pytest to
experimental and see what happens...

Anyone willing to go for it?

Best wishes,

   Julian



Re: Updating pytest

2022-06-07 Thread Julian Gilbey
On Mon, Jun 06, 2022 at 09:01:37PM -0400, Sandro Tosi wrote:
> > > I think this page includes debci results for experimental:
> > >
> > > https://release.debian.org/britney/pseudo-excuses-experimental.html
> > >
> > > It shows what would happen when migrating experimental to unstable.
> >
> > Oh wow, thanks!  That's perfect.  So we can upload the new pytest to
> > experimental and see what happens...
> 
> please be aware that that is a very partial view, in particular only
> showing reverse dependencies with (meaningful) autopkgtests, which
> means there could be hidden gigantic breakages not detected by that
> page which will wreak havoc in unstable.
> 
> I would consider pytest a "core" python package, and so a complete
> rdeps rebuild is appropriate, i suggest having a look at
> https://salsa.debian.org/lucas/collab-qa-tools/-/blob/master/modes/numpy-exp
> and then contacting Lucas to get access to the AWS rebuild machinery.

Hi Sandro,

Ah, that's a very good point.  I was half-aware of it, but didn't
really think it through.

> > Anyone willing to go for it?
> 
> I thought you were volunteering for it? :) jokes aside, i think
> preparing the new pytest upstream release for experimental may be the
> "easiest" part of this ordeal.

I guess it will depend on how much breakage it causes; the
deprecations and breaking changes listed at
https://docs.pytest.org/en/stable/changelog.html seem somewhat
"obscure", but who knows how much they're used in Debian packages?!

I wish I could help, but I only have a couple of hours per week for
Debian stuff, so this is beyond my capacity, unfortunately :-(

Best wishes,

   Julian



Re: a review of your bumblebee-status package

2022-06-07 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 10:51:08AM -0400, Antoine Beaupré wrote:
> It seems to me that generated files shouldn't be shipped as part of the
> source we distributed to users. Those files should be (re)generated at
> build time.

Ah, I understand you better now.  Debian is full of generated files
distributed in source packages.  Most of them could be regenerated at
build time, but it's often not done.  (If they cannot be regenerated
from source, then there's a DFSG issue.)  A search of the mailing list
archives would probably find many discussions about this; I'm not sure
what the current consensus is.

Best wishes,

   Julian



Re: a review of your bumblebee-status package

2022-06-07 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 03:59:22PM +0100, Julian Gilbey wrote:
> > > As far as I understand it, this file is used by the author of the 
> > > program, not end users. I don't understand it well, though, because I 
> > > haven't put much time into researching what versioneer even does.
> > > If my hunch is correct, I may be able to just remove the file from the 
> > > source altogether, but I haven't tried that yet.
> > 
> > Well, it's used by the program to show the version info, so it's
> > *eventually* used by users for sure.
> > 
> > I think just removing the file is a good first guess.
> 
> Removing the file is a bad idea!  Replacing it with the PyPI version
> (through debian/patches) is a much better approach.  (And possibly
> excluding the file in debian/copyright as well.)

Ah, scrap my comment - listen to Timo instead!

Best wishes,

   Julian



Re: a review of your bumblebee-status package

2022-06-07 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 09:47:14AM -0400, Antoine Beaupré wrote:
> >>> _version.py is not a copy of versioneer, it's *generated* by versioneer.
> >>> However, there is versioneer.py in the root directory, which is. I'll
> >>> exclude that from the source and repack.
> >> 
> >> hmm... how about that generated file though? shouldn't it be ... well,
> >> generated at build time instead? :)
> >
> > As far as I understand it, this file is used by the author of the 
> > program, not end users. I don't understand it well, though, because I 
> > haven't put much time into researching what versioneer even does.
> > If my hunch is correct, I may be able to just remove the file from the 
> > source altogether, but I haven't tried that yet.
> 
> Well, it's used by the program to show the version info, so it's
> *eventually* used by users for sure.
> 
> I think just removing the file is a good first guess.

Removing the file is a bad idea!  Replacing it with the PyPI version
(through debian/patches) is a much better approach.  (And possibly
excluding the file in debian/copyright as well.)

   Julian



Re: a review of your bumblebee-status package

2022-06-07 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 09:47:33AM -0400, Antoine Beaupré wrote:
> On 2022-06-07 07:11:15, Julian Gilbey wrote:
> > [...]
> > As far as I understand, versioneer (or the _version.py generated by
> > it) uses a whole bunch of heuristics to determine the version number
> > of the package, for example by looking at git tags and so on.  Several
> > times, I have found that _version.py in the PyPI release of a package
> > is a very small file (just a few lines long) stating the version
> > number of the package, while the _version.py in GitHub is huge and
> > doesn't work on a standalone packaged version.  If I recall correctly,
> > In more than one package I (co-)maintain, I've gone for the PyPI
> > version of this file.
> 
> Uh! So you just keep the file around altogether? That seems like a break
> of policy...

Maybe I wasn't clear: we pack the GitHub sources as the .orig.tar.gz and
then apply a patch (in debian/patches) to replace _version.py with the
version found in the PyPI package.  (There are often good reasons to
prefer the GitHub version over the PyPI version.)  I just looked
through my local packages and the only examples I could find were the
now-removed python-language-server and python-jsonrpc-server.

I'm not sure how this would break policy.

Best wishes,

   Julian



Re: Updating pytest

2022-06-07 Thread Julian Gilbey
On Tue, Jun 07, 2022 at 08:27:38AM +0100, Julian Gilbey wrote:
> > > Anyone willing to go for it?
> > 
> > I thought you were volunteering for it? :) jokes aside, i think
> > preparing the new pytest upstream release for experimental may be the
> > "easiest" part of this ordeal.
> 
> I guess it will depend on how much breakage it causes; the
> deprecations and breaking changes listed at
> https://docs.pytest.org/en/stable/changelog.html seem somewhat
> "obscure", but who knows how much they're used in Debian packages?!
> 
> I wish I could help, but I only have a couple of hours per week for
> Debian stuff, so this is beyond my capacity, unfortunately :-(

Actually, on reflection, with the potential size of this, it is
probably worth collaborating on; I could certainly participate.

Sandro: you managed the numpy transition, it seems.  What is involved
in something like this?  I would imagine something like:

(1) Upload pytest 7.x to experimental

(2) Arrange with Lucas to test build all rdeps against the
experimental version of pytest (by which I mean: all packages which
require python3-pytest as a (recursive) build-dependency)

(3) File bugs (with patches where possible) against all packages which
either FTBFS with the experimental pytest or which fail their
autopkgtest suite with the experimental pytest.  Presumably these bugs
would have a usertag associated with them so they can be easily
monitored.

(4) After an appropriate time period, prepare NMUs for remaining bugs.

(5) Once all bugs are closed, upload to unstable.

I could certainly do (1) and help with (3)-(5) if someone else can do
(2) and help with (3)-(5).

Best wishes,

   Julian



Re: archive rebuild for pytest from experimental

2022-06-16 Thread Julian Gilbey
On Wed, Jun 15, 2022 at 09:30:56PM +0200, Carsten Schoenert wrote:
> Hi,
> [...]
> > 
> > * monitoring-plugins-systemd 2.3.1-2
> 
> I've updated sentry-python last week to the current upstream version, so
> this package can be count as fixed.
> 
> Current pygments requires pytest >= 7.0, I've uploaded 2.12.0 to
> experimental.

Hi Carsten,

Great, thanks.  Since the pygments in testing fails on pytest 7.2.1,
and the version in experimental depends on pytest >= 7.0, we'll need
to do the following when we are ready to upload pytest 7.2.1 to
unstable:

* Mark pytest 7.2.1 as Breaks: pygments (<< 2.12.0)
* Upload pygments 2.12.0 to unstable

Otherwise the pygments in testing will prevent the migration of
pytest.

Best wishes,

   Julian



Re: Build and run-time triplets

2022-06-09 Thread Julian Gilbey
On Thu, Jun 09, 2022 at 11:23:26AM +0500, Andrey Rahmatullin wrote:
> On Wed, Jun 08, 2022 at 10:43:57PM +0100, Julian Gilbey wrote:
> > I'd like to ask for some help.  I'm working on packaging pydevd, which
> > builds a private .so library.  Ordinary extensions built using cython
> > or similar end up being called "foo.cpython-310-x86_64-linux-gnu.so",
> > but this library, which is not dependent on the Python version, should
> > presumably be called "bar.x86_64-linux-gnu.so".
> If it's just a private library and not a Python module it should be called
> bar.so.
> 
> > Question 1: How do I determine (within Python) the triplet to use when
> > building the library?
> You don't.
> 
> > Question 2: How do I determine (within Python) the triplet to use when
> > loading the library at runtime?
> You don't, but also how are you actually loading it?

Well, the upstream wanted to compile two versions of the library, one
for 64 bit architectures and one for 32 bit architectures.  I don't
really want to build two different arch libraries in a single build,
because that seems very contrary to the way the Debian architectures
work, and would also limit it to the amd64/i386 architectures for no
obviously good reason.  I had imagined that if there is some sort of
multiarch setup, one might have the amd64 and i386 packages installed
simultaneously, hence the need for different names.  But I've never
done that myself, so I've no idea if it's even meaningful to do this.

The library is loaded into gdb using this code:

cmd.extend([
"--eval-command='call (void*)dlopen(\"%s\", 2)'" % target_dll,
"--eval-command='sharedlibrary %s'" % target_dll_name,
"--eval-command='call (int)DoAttach(%s, \"%s\", %s)'" % (
is_debug, python_code, show_debug_info)
])

where cmd is a list containing the gdb call, and target_dll and
target_dll_name point to the shared library.

Best wishes,

   Julian



Re: Updating pytest

2022-06-03 Thread Julian Gilbey
On Thu, Jun 02, 2022 at 05:28:36PM +0200, julien.pu...@gmail.com wrote:
> Le jeudi 02 juin 2022 à 10:28 -0400, Sandro Tosi a écrit :
> > > I would suggest ratt-rebuilding all reverse dependencies. Could
> > > that be
> > > done?
> > 
> > there order of thousands rdeps, i dont think it's fair to ask any
> > individual contributor the time and resources to check that via ratt.
> 
> Agreed. If the list of packages to check can't be handled by my modest
> setup during the night, I won't check them all.
> [...]

I believe that ci.debian.net checks packages against packages in
experimental (see
https://ci.debian.net/packages/s/spyder/unstable/amd64/ for example),
so it may be that the work is already done for us; what I don't know,
though, is how to extract this information for the package in
experimental.  Perhaps worth asking the debian-ci people.

Best wishes,

   Julian



Re: Build and run-time triplets

2022-06-09 Thread Julian Gilbey
On Thu, Jun 09, 2022 at 03:00:24PM +0500, Andrey Rahmatullin wrote:
> > The build system here is the standard Python setup.py, except for this
> > library.  That is built by the following script:
> > 
> > ---
> > g++ -m64 -shared -o attach_linux_amd64.so -fPIC -nostartfiles attach.cpp
> > mv attach_linux_amd64.so ../attach_linux_amd64.so
> > echo Compiled amd64
> > ---
> > 
> > There's not even an attempt at working out ${libdir} or so on.
> Sure, it's not necessary to know the installation path at the compile and
> link time. It's only needed at the install time.

Ah, to clarify: this script compiles the library and sticks it into
the parent directory.  It is then installed with the rest of the
Python module via setup.py into /usr/lib/python3/dist-packages/...
So if I want to install it somewhere else, I will have to handle that
manually.

Best wishes,

   Julian



Re: Build and run-time triplets

2022-06-09 Thread Julian Gilbey
On Thu, Jun 09, 2022 at 11:00:28AM +0100, Simon McVittie wrote:
> On Thu, 09 Jun 2022 at 09:56:42 +0100, Julian Gilbey wrote:
> > OK (and yes, it does require the full path at runtime).  What triplet
> > do I use in d/rules?  dpkg-architecture offers 6 different ones:
> > DEB_{BUILD,HOST,TARGET}_{GNU_TYPE,MULTIARCH}?  I'm guessing
> > DEB_TARGET_MULTIARCH, but I'm really not certain, so it would be great
> > to confirm that.
> 
> You'd want DEB_HOST_MULTIARCH here (or use ${LIB} as I mentioned in a
> previous message to this thread).
> [...]

Thanks for this great explanation - I'm learning so much in this
thread!

> > About the location, though: why do compiled Python libraries live in
> > /usr/lib/python3/dist-packages/ and not
> > /usr/lib/<triplet>/?
> 
> The Python jargon for a native C/C++ library that can be loaded to
> provide a Python module is an *extension*.
> [...]

> > And is there a good reason not to do
> > the same with this Python-package-specific library?
> 
> If it isn't a Python extension (cannot be loaded into Python with
> "import foo" to provide a module API to Python code) then it would seem
> inappropriate to put it in the directory that is reserved for Python
> extensions.

I hear this argument, though we do find data files and other sorts of
things other than Python files and compiled extensions that are needed
by Python modules in that same directory ("data_files" in setup.py);
for example numpy has dozens of other files in
/usr/lib/python3/dist-packages/numpy.

I think the counter-argument is exactly what you said: as long as the
shared library is in a place where the code that uses it knows where
to locate it, all will work fine.  The Python code as-written expects
the library to be in the same directory as the Python code.
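
For concreteness, the location logic amounts to something like this - a
minimal sketch rather than upstream's exact code, using the library name
that upstream's build script produces:

import os

# the helper library is expected to sit next to this module, wherever
# the package happens to be installed
_here = os.path.dirname(os.path.abspath(__file__))
target_dll = os.path.join(_here, "attach_linux_amd64.so")
target_dll_name = os.path.basename(target_dll)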

Best wishes,

   Julian



Re: Build and run-time triplets

2022-06-09 Thread Julian Gilbey
On Thu, Jun 09, 2022 at 01:03:25PM +0500, Andrey Rahmatullin wrote:
> [...]
> > Well, the upstream wanted to compile two versions of the library, one
> > for 64 bit architectures and one for 32 bit architectures.  I don't
> > really want to build two different arch libraries in a single build,
> > because that seems very contrary to the way the Debian architectures
> > work, and would also limit it to the amd64/i386 architectures for no
> > obviously good reason.  I had imagined that if there is some sort of
> > multiarch setup, one might have the amd64 and i386 packages installed
> > simultaneously, hence the need for different names.
> The normal way for this is putting it into
> /usr/lib/<triplet>/pkgname/foo.so, and according to the code below you'll
> need the full path at the run time so you indeed need the triplet at both
> build and run time. You can get the triplet in d/rules, not sure how
> should you pass it to the build system as that depends on the build system
> used. For the run time, https://wiki.debian.org/Python/MultiArch suggests
> sys.implementation._multiarch (note that you cannot use it during the
> build as that would break cross-compilation etc.), not sure if there are
> better ways. 

Thanks for your help!

OK (and yes, it does require the full path at runtime).  What triplet
do I use in d/rules?  dpkg-architecture offers 6 different ones:
DEB_{BUILD,HOST,TARGET}_{GNU_TYPE,MULTIARCH}?  I'm guessing
DEB_TARGET_MULTIARCH, but I'm really not certain, so it would be great
to confirm that.

About the location, though: why do compiled Python libraries live in
/usr/lib/python3/dist-packages/ and not
/usr/lib/<triplet>/?  And is there a good reason not to do
the same with this Python-package-specific library?  It's not for
general use, so I can't see why I shouldn't put it in the python3
directory with the other compiled Python module libraries.
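
For the runtime side, I guess the sys.implementation._multiarch
suggestion would look something like this (a sketch only, with a
hypothetical install path):

import sys
import sysconfig

# multiarch triplet of the interpreter actually running,
# e.g. 'x86_64-linux-gnu'
triplet = (getattr(sys.implementation, "_multiarch", None)
           or sysconfig.get_config_var("MULTIARCH"))

# hypothetical location if the library were shipped under the triplet dir
target_dll = "/usr/lib/%s/pydevd/attach.so" % triplet
print(target_dll)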

Best wishes,

   Julian



Re: Build and run-time triplets

2022-06-09 Thread Julian Gilbey
On Thu, Jun 09, 2022 at 10:26:13AM +0100, Simon McVittie wrote:
> On Thu, 09 Jun 2022 at 13:03:25 +0500, Andrey Rahmatullin wrote:
> > The normal way for this is putting it into
> > /usr/lib/<triplet>/pkgname/foo.so, and according to the code below you'll
> > need the full path at the run time so you indeed need the triplet at both
> > build and run time.
> 
> You can do something like
> 
>  handle = dlopen("/usr/${LIB}/pkgname/foo.so", flags);
> [...]
> 
> Then you'd install the private library into what Autotools would refer to
> as ${libdir}/pkgname/foo.so (adjust as necessary for other build systems)
> and it will usually end up in the correct place. This assumes that
> ${libdir} is configured to something like
> ${exec_prefix}/lib/x86_64-linux-gnu or ${exec_prefix}/lib64 as appropriate
> for the distribution, but that's normally true anyway, and in particular
> should be true in debhelper.

Thanks Simon!

The build system here is the standard Python setup.py, except for this
library.  That is built by the following script:

---
g++ -m64 -shared -o attach_linux_amd64.so -fPIC -nostartfiles attach.cpp
mv attach_linux_amd64.so ../attach_linux_amd64.so
echo Compiled amd64
---

There's not even an attempt at working out ${libdir} or so on.  It
seems like overkill to set up a whole Autotools environment for this
one file :-(

I'm still unsure why I shouldn't just put it in the Python package
area near the cython files such as
/usr/lib/python3/dist-packages/_pydevd_bundle/pydevd_cython.cpython-310-x86_64-linux-gnu.so

Best wishes,

   Julian



Re: Notes from the DC22 Python Team BoF

2022-07-25 Thread Julian Gilbey
On Sat, Jul 23, 2022 at 07:52:19PM +0200, Louis-Philippe Véronneau wrote:
> Hey folks,
> 
> We had a Python Team BoF at DC22 earlier today and I thought relaying the
> notes we took in gobby here would be a good idea.

Thanks for the notes, Louis-Philippe, and sorry I couldn't join you!

A few comments:

> --
> == python3.11 ==
> 
> python3.11 release has been delayed, from october 2022 to december 2022.
> [...]

My 2 cents' worth: as the 3.9->3.10 transition took several months
and was quite complicated, it is not wise to attempt the 3.10->3.11
transition before the freeze.  We could then potentially go straight
to 3.12 a few months after the bookworm freeze rather than going to
3.11 first.  And that will probably be quite painful.

> == pybuild improvements ==
> 
> getting the autopkgtest MR in would be great
> 
> https://salsa.debian.org/python-team/tools/dh-python/-/merge_requests/27
> 
> We need people to test this MR some more, although it seems fairly mature.
> 
> It might be a good idea to have a line in d/control to let us migrate from
> the existing autopkgtests running unit tests to the new automated MR.

I'll take this to a separate email.

> == lintian tags requests for the team ==
> 
> pollo will write you Python-related lintian tags. Ask him to.

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1004746  :-)

   Julian



pybuild-autopkgtest (was: Notes from the DC22 Python Team BoF)

2022-07-25 Thread Julian Gilbey
On Sat, Jul 23, 2022 at 07:52:19PM +0200, Louis-Philippe Véronneau wrote:
> == pybuild improvements ==
> 
> getting the autopkgtest MR in would be great
> 
> https://salsa.debian.org/python-team/tools/dh-python/-/merge_requests/27
> 
> We need people to test this MR some more, although it seems fairly mature.
> 
> It might be a good idea to have a line in d/control to let us migrate from
> the existing autopkgtests running unit tests to the new automated MR.

I've been looking at this a bit more.  I'm not sure what this last
paragraph means: the new "automated" autopkgtest will only be run if
the maintainer explicitly adds:

Testsuite: autopkgtest-pkg-pybuild

to debian/control (see the autodep8 MR at
https://salsa.debian.org/ci-team/autodep8/-/merge_requests/27/diffs -
it will never automatically detect a pybuild package).
And a maintainer would presumably only add that if they are also
removing their existing debian/tests/control (or want to run the
pybuild tests in addition).

An alternative would be for the autodep8 patch to try to determine
whether to run pybuild-autopkgtest.  One approach could be:

if the package would run autopkgtest-pkg-python:
    if debian/rules does not contain an override_dh_auto_test target:
        run pybuild-autopkgtest

Note, though, that if autodep8 is called, it will run all of the
detected tests.  (At least that is what I believe happens from reading
/usr/bin/autodep8; I haven't double-checked this.)  So, for example,
if a package specifies

Testsuite: autopkgtest-pkg-python

it will also run the autopkgtest-pkg-pybuild suite as it will be
detected as being a Python package, and vice versa.  That is a
possible reason *not* to use the above suggestion, as it would
potentially run pybuild-autopkgtest even if not desired.

Best wishes,

   Julian



Re: Build and run-time triplets

2022-07-24 Thread Julian Gilbey
On Mon, Jul 25, 2022 at 12:41:16AM +0500, Andrey Rahmatullin wrote:
> On Sun, Jul 24, 2022 at 08:30:42PM +0100, Julian Gilbey wrote:
> [...]
> > > > are they all effectively Multi-Arch: no?  Is this worth thinking about
> > > > in the longer term?
> > > What do you propose?
> > 
> > I think the fix to bug #812228 might have done the job nicely ;-)
> If it actually ships extensions, the "it should usually get a dependency
> on the Python interpreter for the same architecture" part should still
> apply as far as I understand it.

Thanks Andrey!

OK.  So let's dissect this tag info and see where we're currently at.

  Explanation: This Multi-Arch: same package uses pycompile or
   py3compile in the specified maintainer script.
   .
   py{,3}compile are tools used to byte-compile Python source
   files. It is typically run on installation of Debian packages that ship
   Python modules. However, they do not support installing several
   architectures of the same package and this is not Multi-Arch: safe.

This is now out-of-date: firstly, we can presumably get rid of the
pycompile mention, as there are only a tiny handful of Python 2
packages still around, and we're trying to get rid of them.

Secondly, py3compile now supports installing several architectures of
the same package; see the closing changelog message on bug 812228:

 Architecture-qualify py*compile and py*clean calls in maintainer scripts,
 for architecture-specific Python packages. This allows co-installation
 (and even concurrent unpacking) of different architectures of a package.

So the rest of the paragraph is also out of date.

   If the contents of the package is not architecture dependent, it should
   usually be made binary-all.

That is still certainly true.

   If the contents of the package is architecture dependent, it should
   usually get a dependency on the Python interpreter for the same
   architecture. This is a dependency in the form of python3, not
   an architecture-qualified dependency such as python3:any (which
   can be fulfilled by the Python interpreter for any architecture).

This is interesting; dh-python gives the dependency:
   python3 (<< 3.11), python3 (>= 3~), python3:any
which has both same-architecture and qualified architecture
dependencies; obviously the same-architecture one "wins".  But this
paragraph is probably unnecessary for most dh-python-using packages
(though it doesn't seem to do any harm).

   If a dependency on the Python interpreter for the same architecture
   exists (usually generated by dh-python), the
   Multi-Arch: same has no effect and should be dropped.

Ah.  I see the point.  Because python3 and python3-minimal are
Multi-Arch: allowed, the different arches of python3 are not
co-installable, and so there is no point in labelling the
arch-dependent module packages as Multi-Arch: same; they still could
not be co-installed.

  See-Also: pycompile(1), py3compile(1), Bug#812228

This list can probably be trimmed down to py3compile.


I hope this reasoning is useful; shall I pass it on to the lintian
folk?

Best wishes,

   Julian



Re: pybuild-autopkgtest (was: Notes from the DC22 Python Team BoF)

2022-07-28 Thread Julian Gilbey
On Wed, Jul 27, 2022 at 09:32:19PM +0100, Julian Gilbey wrote:
> [...]
> > 
> > I'd be wary about 2.2 and 2.3.  I have several packages where I know
> > that an automated test will fail; there are all sorts of weird cases
> > [...]
> 
> I'd be wary about adding lintian tags for this, though: with so many
> packages not being able to use the autodep8 system (I vaguely recall
> someone suggesting that a third of Python packages would not be able
> [...]

I realise that I may have come across as quite negative.  Apologies if
that is the case - it was not my intention.  I think the
autopkgtest-pkg-pybuild/pybuild-autodep8 work is very helpful and a
positive addition to our infrastructure, and I'm very grateful to
those who've made it happen.

My only concern is with how it is introduced; because of the wide
variety of Python packages and the hugely varying nature of the
testsuites present in them (or not), I think that trying to force this
into packages is a poor idea.  (I wish that most of my packages could
simply use this pybuild-autodep8 tool.  I fear that this won't be the
case, but I will certainly adopt it for those which can.)

Best wishes,

   Julian



Re: pybuild-autopkgtest (was: Notes from the DC22 Python Team BoF)

2022-07-27 Thread Julian Gilbey
On Tue, Jul 26, 2022 at 11:50:19AM -0300, Antonio Terceiro wrote:
> I think the notes did not capture the consensus correctly. The point was
> that it should be possible to automate updating the `Testsuite:` field
> to run tests with pybuild-autopkgtest, and that we should probably do
> that across team packages with the help of some scripting.

This makes more sense, thanks.

There seems to be little point running both pybuild-autopkgtest and a
manually written debian/tests/* test suite.  So would the script only
add pybuild-autopkgtest to packages which don't have a manually
written debian/tests/* suite?
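
Something like the following check would probably suffice for that; this
is only a rough sketch (it assumes python3-debian is available, the
function name is made up, and it is run from the top of an unpacked
source tree):

import os
from debian.deb822 import Deb822

def could_use_pybuild_autopkgtest(srcdir="."):
    # Leave packages with hand-written autopkgtests alone
    tests_dir = os.path.join(srcdir, "debian", "tests")
    if os.path.isdir(tests_dir) and os.listdir(tests_dir):
        return False
    # Skip packages that already declare a Testsuite field
    with open(os.path.join(srcdir, "debian", "control")) as f:
        source_stanza = next(Deb822.iter_paragraphs(f))
    return "Testsuite" not in source_stanza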

Best wishes,

   Julian



Re: pybuild-autopkgtest (was: Notes from the DC22 Python Team BoF)

2022-07-27 Thread Julian Gilbey
On Wed, Jul 27, 2022 at 10:26:33AM -0400, Louis-Philippe Véronneau wrote:
> The way I see it:
> 
> 1. We should have a Lintian tag for packages not using the new
> pybuild-autodep8 autopkgtest. It would be even better if this tag would be a
> pointed hint that identified 'manually' written unit test autopkgtests that
> could be replaced.
> 
> This way, you get something like:
> 
> python-foo source: not-using-pybuild-autodep8 [debian/tests/unittests]
> 
> for python packages that have old 'manually' written unit test autopkgtests
> and:
> 
> python-foo source: not-using-pybuild-autodep8 [no-autopkgtest]
> 
> for python packages without any autopkgtest.
> 
> 2. lintian-brush (or something else, but I think lintian-brush is the right
> tool) would go over these packages to:
> 
> 2.1 Add the new autodep8 autopkgtests and build the package to see if they
> pass
> 2.2 Remove the "manual" unit test autopkgtests if 2.1 succeeds
> 2.3 Open a bug report if 2.1 fails

I'd be wary about 2.2 and 2.3.  I have several packages where I know
that an automated test will fail; there are all sorts of weird cases
where I've had to write tests manually.  I would also be quite cross
if manually crafted tests were automatically removed, especially in
cases such as Simon mentioned where they do things that that
automatically generated test does not do.  Another thing I could
imagine happening is that the automated test succeeds in a trivial way
- it succeeds but doesn't actually test much because of the nature of
the package.

On the other hand, a bug report saying something like the following
seems much more reasonable: "We've tested this package using the
automated autopkgtest system and it seems to work by adding the line
'Testsuite: autopkgtest-pkg-pybuild'; please check that the automated
tests cover all of the tests of your manually written debian/tests/*
and if so, then please remove them.  The autopkgtest-pkg-pybuild logs
are attached."  This would give the maintainer the chance to decide
how best to proceed.

Best wishes,

   Julian



Re: pybuild-autopkgtest (was: Notes from the DC22 Python Team BoF)

2022-07-27 Thread Julian Gilbey
On Wed, Jul 27, 2022 at 07:45:12PM +0100, Julian Gilbey wrote:
> On Wed, Jul 27, 2022 at 10:26:33AM -0400, Louis-Philippe Véronneau wrote:
> > The way I see it:
> > 
> > 1. We should have a Lintian tag for packages not using the new
> > pybuild-autodep8 autopkgtest. It would be even better if this tag would be a
> > pointed hint that identified 'manually' written unit test autopkgtests that
> > could be replaced.
> > 
> > This way, you get something like:
> > 
> > python-foo source: not-using-pybuild-autodep8 [debian/tests/unittests]
> > 
> > for python packages that have old 'manually' written unit test autopkgtests
> > and:
> > 
> > python-foo source: not-using-pybuild-autodep8 [no-autopkgtest]
> > 
> > for python packages without any autopkgtest.
> > 
> > 2. lintian-brush (or something else, but I think lintian-brush is the right
> > tool) would go over these packages to:
> > 
> > 2.1 Add the new autodep8 autopkgtests and build the package to see if they
> > pass
> > 2.2 Remove the "manual" unit test autopkgtests if 2.1 succeeds
> > 2.3 Open a bug report if 2.1 fails
> 
> I'd be wary about 2.2 and 2.3.  I have several packages where I know
> that an automated test will fail; there are all sorts of weird cases
> where I've had to write tests manually.  I would also be quite cross
> if manually crafted tests were automatically removed, especially in
> cases such as Simon mentioned where they do things that that
> automatically generated test does not do.  Another thing I could
> imagine happening is that the automated test succeeds in a trivial way
> - it succeeds but doesn't actually test much because of the nature of
> the package.
> 
> On the other hand, a bug report saying something like the following
> seems much more reasonable: "We've tested this package using the
> automated autopkgtest system and it seems to work by adding the line
> 'Testsuite: autopkgtest-pkg-pybuild'; please check that the automated
> tests cover all of the tests of your manually written debian/tests/*
> and if so, then please remove them.  The autopkgtest-pkg-pybuild logs
> are attached."  This would give the maintainer the chance to decide
> how best to proceed.

Here's another alternative to steps 2.1-2.3 based on this:

 For packages which currently have manually-written autopkgtests:

 2.A Try removing debian/tests and adding Testsuite:
 autopkgtest-pkg-pybuild to debian/control, then building the
 package and running autopkgtest.

 2.B If this works, then submit a bug report to the BTS as I suggested
 above.

 2.C If this does not work, don't do anything more; trust that the
 maintainer knew what they were doing when they wrote the manual
 autopkgtests.

 For packages which don't currently have manually-written
 autopkgtests:

 2.A' Try adding Testsuite: autopkgtest-pkg-pybuild to debian/control,
  then building the package and running autopkgtest.

 2.B' If this works, then either Janitor adds this line to
  debian/control or a bug report is submitted to the BTS to recommend
  this.  (But we would not expect Janitor to do step 2.A', so this
  might have to be a different setup, or maybe the script doing
  2.A' could leave a list of packages for Janitor, or something
  like that.)

 2.C' If this does not work, submit a wishlist bug to the BTS to
  recommend that the maintainer adds some autopkgtest tests,
  either by making the autodep8 system work or by writing some
  manual tests.

I'd be wary about adding lintian tags for this, though: with so many
packages not being able to use the autodep8 system (I vaguely recall
someone suggesting that a third of Python packages would not be able
to use the system), that's a lot of packages getting false positives.
In particular, for the suggested first version of
not-using-pybuild-autodep8 tag (which would probably be better named
manual-autopkgtest-could-be-pybuild-autodep8), how would lintian go
about identifying which packages fall into category 2.B and which into
2.C?  The second version of the tag (better named something like
no-autopkgtest-could-use-pybuild-autodep8, but that's still not very
good) is less problematic.

Best wishes,

   Julian



Re: Build and run-time triplets

2022-07-24 Thread Julian Gilbey
On Sun, Jul 24, 2022 at 11:41:56PM +0500, Andrey Rahmatullin wrote:
> [...]
> > 
> > I got all of the triplets working, but was then stymied when I tried
> > to specify Multi-Arch: same in the control file.  I got a lintian
> > warning: multi-arch-same-package-calls-pycompile
> > It seems that since the pybuild system (via dh_python3) adds a
> > py3compile command to the postinst of the package, then I can't safely
> > use Multi-Arch: same.
> > 
> > I don't know if this is the case for all python3 Arch: any packages
> > with compiled extensions;
> I think the tag desciption has a good step-by-step explanation why does
> the tag exists.
> But your package is not a "package with compiled extensions", is it?

Yes, it's got compiled extensions (Cython) in addition to this
non-Python .so library file.

> > are they all effectively Multi-Arch: no?  Is this worth thinking about
> > in the longer term?
> What do you propose?

I think the fix to bug #812228 might have done the job nicely ;-)

Best wishes,

   Julian



Re: Build and run-time triplets

2022-07-24 Thread Julian Gilbey
On Sun, Jul 24, 2022 at 11:46:14PM +0500, Andrey Rahmatullin wrote:
> On Sun, Jul 24, 2022 at 06:36:57PM +0100, Julian Gilbey wrote:
> > I got all of the triplets working, but was then stymied when I tried
> > to specify Multi-Arch: same in the control file.  I got a lintian
> > warning: multi-arch-same-package-calls-pycompile
> > It seems that since the pybuild system (via dh_python3) adds a
> > py3compile command to the postinst of the package, then I can't safely
> > use Multi-Arch: same.
> Actually, #812228, mentioned in the tag description, was fixed in 2021 so
> it's possible that this is no longer a problem.

Ah, that's exciting, thanks!  So maybe this lintian tag should be
dropped?

Best wishes,

   Julian



Re: Build and run-time triplets

2022-07-24 Thread Julian Gilbey
Well, here's an update on this old thread...

On Thu, Jun 09, 2022 at 01:03:25PM +0500, Andrey Rahmatullin wrote:
> On Thu, Jun 09, 2022 at 08:42:28AM +0100, Julian Gilbey wrote:
> > > > I'd like to ask for some help.  I'm working on packaging pydevd, which
> > > > builds a private .so library.  Ordinary extensions built using cython
> > > > or similar end up being called "foo.cpython-310-x86_64-linux-gnu.so",
> > > > but this library, which is not dependent on the Python version, should
> > > > presumably be called "bar.x86_64-linux-gnu.so".
> > > If it's just a private library and not a Python module it should be called
> > > bar.so.
> > > 
> > > > Question 1: How do I determine (within Python) the triplet to use when
> > > > building the library?
> > > You don't.
> > > 
> > > > Question 2: How do I determine (within Python) the triplet to use when
> > > > loading the library at runtime?
> > > You don't, but also how are you actually loading it?
> > 
> > Well, the upstream wanted to compile two versions of the library, one
> > for 64 bit architectures and one for 32 bit architectures.  I don't
> > really want to build two different arch libraries in a single build,
> > because that seems very contrary to the way the Debian architectures
> > work, and would also limit it to the amd64/i386 architectures for no
> > obviously good reason.  I had imagined that if there is some sort of
> > multiarch setup, one might have the amd64 and i386 packages installed
> > simultaneously, hence the need for different names.
> The normal way for this is putting it into
> /usr/lib/<triplet>/pkgname/foo.so, and according to the code below you'll
> need the full path at the run time so you indeed need the triplet at both
> build and run time. You can get the triplet in d/rules, not sure how
> should you pass it to the build system as that depends on the build system
> used. For the run time, https://wiki.debian.org/Python/MultiArch suggests
> sys.implementation._multiarch (note that you cannot use it during the
> build as that would break cross-compilation etc.), not sure if there are
> better ways. 
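
For reference, the run-time side of that suggestion would look something
like this (a rough sketch only; the package and library names here are
made up):

import ctypes
import sys

def load_private_library(pkgname="pydevd", libname="attach.so"):
    # On Debian, sys.implementation._multiarch is e.g. 'x86_64-linux-gnu'
    triplet = sys.implementation._multiarch
    return ctypes.CDLL(f"/usr/lib/{triplet}/{pkgname}/{libname}")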

I got all of the triplets working, but was then stymied when I tried
to specify Multi-Arch: same in the control file.  I got a lintian
warning: multi-arch-same-package-calls-pycompile
It seems that since the pybuild system (via dh_python3) adds a
py3compile command to the postinst of the package, then I can't safely
use Multi-Arch: same.

I don't know if this is the case for all python3 Arch: any packages
with compiled extensions; are they all effectively Multi-Arch: no?  Is
this worth thinking about in the longer term?

Best wishes,

   Julian



Re: pdm-pep517: Shall we package it now?

2022-06-30 Thread Julian Gilbey
On Wed, Jun 29, 2022 at 10:58:48AM -0400, Boyuan Yang wrote:
> Hi,
> [...]
> Thanks. I have uploaded an initial version to the NEW queue. The packaging
> work is at https://salsa.debian.org/python-team/packages/pdm-pep517/ .
> 
> It is worth noting that I took the very aggressive way in stripping every
> vendored library, see
> https://salsa.debian.org/python-team/packages/pdm-pep517/-/tree/master/debian/patches
> . This may not be ideal since the patch will need human intervention for every
> new upstream release, but anyway let's have a working version in Sid first.
> Meanwhile, any review or other types of help would be appreciated.

Hi Boyuan,

I've just taken a look; the exclusions and patching look fine to me.
Yes, there will need to be some human intervention, but there always
should be for new upstream versions.  It may well be that all that is
needed is to refresh the patch and run:

quilt push -a
rgrep _vendor pdm

(or the equivalent gbp pq push or something like that) to check that
nothing has changed.

One small change to the patch:

* In pdm/pep517/utils.py and pdm/pep517/wheel.py, there is no need to
  insert an "import packaging"

Best wishes,

   Julian



Re: rstcheck: how should I package the new public module?

2022-04-18 Thread Julian Gilbey
On Mon, Apr 18, 2022 at 07:15:46PM +0200, Timo Röhling wrote:
> Hi,
> 
> rstcheck used to be a pure CLI application, so I packaged it as
> "rstcheck" with a private module in /usr/share/rstcheck. The latest
> version has added a public module interface, so "import rstcheck"
> has become a thing and the module needs to be installed publicly for
> that.
> 
> My question is: should I move/rename the whole package to
> python3-rstcheck (and keep rstcheck temporarily as transitional
> dummy package), or should I keep /usr/bin/rstcheck in the old
> package indefinitely and only move the module to python3-rstcheck?
> 
> The latter solution feels cleaner to me, because it neatly separates
> the library and the application, but at the cost of an almost empty
> binary package, which is frowned upon by the FTP team. Any
> suggestions how I should proceed?

Hi Tim,

We've done exactly that with spyder: spyder contains just the binary,
manpage, desktop file, appdata, icon and reportbug file.  This is also
helpful if there is ever a Python 4.x, as then there will be three
packages: rstcheck, python3-rstcheck and python4-rstcheck.

This information may be of use to you.

Best wishes,

   Julian



Re: Updating jupyter-core to 4.10.0

2022-05-06 Thread Julian Gilbey
On Thu, May 05, 2022 at 11:07:31PM +0200, julien.pu...@gmail.com wrote:
> Le jeudi 05 mai 2022 à 09:43 +0100, Julian Gilbey a écrit :
> > 
> > I've had similar problems in the past.  It usually comes down to the
> > build requiring some package that happens to be present on your
> > system but not listed in the Build-Depends field, so dpkg-
> > buildpackage works, but sbuild doesn't.
> 
> That's indeed the usual suspect ; but here it turns out it wanted a
> writable home directory.
> 
> I'll check it doesn't break everything then upload... probably
> tomorrow...
> 
> Thanks!
> 
> J.Puydt

Ah, of course, that's the other thing I've hit in recent times with
ipython-related packages.  Note that writing to the home directory at
build-time is a contravention of Policy; see the thread starting at
https://lists.debian.org/debian-devel/2022/04/msg00345.html where I
ask about pretty much the same thing, and a clear distinction is made
between build time and autopkgtest time.  You mentioned earlier that
it's the tests that are failing; are these build-time tests or
autopkgtest tests?  If they're build-time tests, you could either
disable them or create a temporary home directory.  If they're
autopkgtest tests, then it depends on your test environment; see that
thread for recommendations and
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1010437 for a
discussion of how to set up lxc with autopkgtest.

Best wishes,

   Julian



Re: Updating jupyter-core to 4.10.0

2022-05-05 Thread Julian Gilbey
Hi Julien,

On Thu, May 05, 2022 at 09:51:28AM +0200, julien.pu...@gmail.com wrote:
> Hi,
> 
> I tried to update jupyter-core to 4.10.0, but didn't manage to run the
> upstream test suite reliably.
> 
> I first disabled it, so the package was building with both dpkg-
> buildpackage and sbuild. A bad solution.
> 
> Then I re-enabled it ; it passes with dpkg-buildpackage (there's a
> catch, see [1]), but not with sbuild, and I couldn't find out why.
> 
> If someone has a clue, I'm ready to learn new tricks...

I've had similar problems in the past.  It usually comes down to the
build requiring some package that happens to be present on your system
but not listed in the Build-Depends field, so dpkg-buildpackage works,
but sbuild doesn't.  One thing I've done (boring, but has worked in
this sort of case) is to search the code for any occurrences of
"import" (patterns such as r'^\s*import\s' and
r'^\s*from\s.*\simport\s' are useful here); it might reveal a package
not listed in the install requirements.
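
As a rough sketch of that search in Python, using the same patterns (the
helper name is made up, and the module-name extraction is only
approximate):

import re
from pathlib import Path

IMPORT_RE = re.compile(r'^\s*(?:import\s+(\S+)|from\s+(\S+)\s+import\s)')

def imported_modules(srcdir="."):
    # Collect the top-level names of everything imported in the source tree
    found = set()
    for path in Path(srcdir).rglob("*.py"):
        for line in path.read_text(errors="replace").splitlines():
            m = IMPORT_RE.match(line)
            if m:
                found.add((m.group(1) or m.group(2)).split(".")[0])
    return sorted(found)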

Best wishes,

   Julian



Lintian info message "hardening-no-bindnow" with vanilla debian/rules

2022-08-30 Thread Julian Gilbey
Hi!

A package I maintain within the team (python3-pyxdameraulevenshtein)
gives the following lintian message:

I: python3-pyxdameraulevenshtein: hardening-no-bindnow 
[usr/lib/python3/dist-packages/pyxdameraulevenshtein.cpython-310-x86_64-linux-gnu.so]

The debian/rules file is very bland, essentially:

%:
	dh $@ --buildsystem=pybuild

and there is nothing about CFLAGS or the like in the setup.py file.
So if having this hardening flag enabled is a good thing, it should
probably be enabled somewhere within the pybuild system, rather than
every individual package with an extension file doing it.

Or have I missed something?

Best wishes,

   Julian



Re: Lintian info message "hardening-no-bindnow" with vanilla debian/rules

2022-08-31 Thread Julian Gilbey
On Tue, Aug 30, 2022 at 07:33:07PM +0200, Gregor Riepl wrote:
> > I: python3-pyxdameraulevenshtein: hardening-no-bindnow 
> > [usr/lib/python3/dist-packages/pyxdameraulevenshtein.cpython-310-x86_64-linux-gnu.so]
> > 
> > and there is nothing about CFLAGS or the like in the setup.py file.
> > So if having this hardening flag enabled is a good thing, it should
> > probably be enabled somewhere within the pybuild system, rather than
> > every individual package with an extension file doing it.
> 
> Hardening is generally a good thing, but can break code in subtle ways.
> I suppose that's why it was decided that enabling it by default in Debian
> was deemed too risky.
> 
> Enabling it is quite easy, though: Just add
> 
> export DEB_BUILD_MAINT_OPTIONS = hardening=+all
> [...]

Thanks Gregor, I'll try that!

> Also, note that hardening-no-bindnow is an Informational message, so not
> strictly something that needs to be acted upon:
> https://lintian.debian.org/tags/hardening-no-bindnow

Indeed, hence the title of this message :-)

Best wishes,

   Julian



Re: [Debian-salsa-ci] Enabling salsa-ci on all Debian Python Team repos

2022-09-19 Thread Julian Gilbey
On Mon, Sep 19, 2022 at 01:52:09PM +0200, Iñaki Malerba wrote:
> [...]
> > Perhaps there's an opportunity to automate and getting wider CI usage.
> 
> One of the biggest issues we had when a team adopted the pipeline was
> DDOSing of the instance because of the multiple pipelines generated when
> pushing the .gitlab-ci.yml file to all the projects.
> 
> If you're planning to do this, please:
> 
> - Use the API and configure the 'CI/CD configuration file' project
>   field, as you mentioned in the email. This won't generate a pipeline
>   when configured but only on the next push.

Indeed; setting the configuration file to
  recipes/debian.yml@salsa-ci-team/pipeline
will avoid any need to touch the actual repository.
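
If anyone wants to script that across the team's repositories, something
along these lines should work (a rough sketch, assuming python3-gitlab
and a salsa API token; untested):

import gitlab

def set_salsa_ci_config(project_path, token):
    gl = gitlab.Gitlab("https://salsa.debian.org", private_token=token)
    project = gl.projects.get(project_path)
    # Point the project at the shared pipeline definition; changing this
    # does not start a pipeline, it only takes effect on the next push.
    project.ci_config_path = "recipes/debian.yml@salsa-ci-team/pipeline"
    project.save()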

> - If you need to create the .gitlab-ci.yml file, please use the
>   `ci.skip`[1] push option.

And that should only be needed if the configuration is non-standard.

> Thanks, and good luck :)

Best wishes,

   Julian



Bug#1019435: ITP: rapidfuzz -- rapid fuzzy string matching

2022-09-09 Thread Julian Gilbey
Package: wnpp
Severity: wishlist
Owner: Julian Gilbey 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-de...@lists.debian.org, 
debian-python@lists.debian.org

* Package name: rapidfuzz
  Version : 2.6.1
  Upstream Author : Max Bachmann 
* URL : https://github.com/maxbachmann/RapidFuzz
* License : MIT
  Programming Lang: Python
  Description : rapid fuzzy string matching

RapidFuzz is a fast string matching library for Python and C++, which
uses the string similarity calculations from
[FuzzyWuzzy](https://github.com/seatgeek/fuzzywuzzy).  However there
are a couple of aspects that set RapidFuzz apart from FuzzyWuzzy:
1) It is MIT licensed so it can be used whichever License you might want to 
choose for your project, while you're forced to adopt the GPL license when 
using FuzzyWuzzy
2) It provides many string_metrics like hamming or jaro_winkler, which are not 
included in FuzzyWuzzy
3) It is mostly written in C++ and on top of this comes with a lot of 
Algorithmic improvements to make string matching even faster, while still 
providing the same results. For detailed benchmarks check the 
[documentation](https://maxbachmann.github.io/RapidFuzz/fuzz.html)
4) Fixes multiple bugs in the `partial_ratio` implementation

This is a dependency of the latest upstream release of
python3-textdistance.

There are also two C++ libraries contained within this package,
managed as separate GitHub subrepositories.  One is rapidfuzz-cpp, and
I am not sure whether to bundle this as a single package or whether to
package this independently.  The other is taskflow
(https://github.com/taskflow/taskflow), a C++ header-only package; I
think this should probably be packaged separately.

This package will be maintained within the Python Packaging Team.



Re: Bug#1019435: ITP: rapidfuzz -- rapid fuzzy string matching

2022-09-09 Thread Julian Gilbey
On Fri, Sep 09, 2022 at 10:00:49AM +0100, Julian Gilbey wrote:
> Package: wnpp
> Severity: wishlist
> Owner: Julian Gilbey 
> X-Debbugs-Cc: debian-de...@lists.debian.org, debian-de...@lists.debian.org, 
> debian-python@lists.debian.org
> 
> * Package name: rapidfuzz
>   Version : 2.6.1
>   Upstream Author : Max Bachmann 
> * URL : https://github.com/maxbachmann/RapidFuzz
> * License : MIT
>   Programming Lang: Python
>   Description : rapid fuzzy string matching
> 
> RapidFuzz is a fast string matching library for Python and C++, which
> uses the string similarity calculations from
> [FuzzyWuzzy](https://github.com/seatgeek/fuzzywuzzy).  However there
> are a couple of aspects that set RapidFuzz apart from FuzzyWuzzy:
> 1) It is MIT licensed so it can be used whichever License you might want to 
> choose for your project, while you're forced to adopt the GPL license when 
> using FuzzyWuzzy
> 2) It provides many string_metrics like hamming or jaro_winkler, which are 
> not included in FuzzyWuzzy
> 3) It is mostly written in C++ and on top of this comes with a lot of 
> Algorithmic improvements to make string matching even faster, while still 
> providing the same results. For detailed benchmarks check the 
> [documentation](https://maxbachmann.github.io/RapidFuzz/fuzz.html)
> 4) Fixes multiple bugs in the `partial_ratio` implementation
> 
> This is a dependency of the latest upstream release of
> python3-textdistance.
> 
> There are also two C++ libraries contained within this package,
> managed as separate GitHub subrepositories.  One is rapidfuzz-cpp, and
> I am not sure whether to bundle this as a single package or whether to
> package this independently.  The other is taskflow
> (https://github.com/taskflow/taskflow), a C++ header-only package; I
> think this should probably be packaged separately.
> 
> This package will be maintained within the Python Packaging Team.

I should have added: this package depends on Cython >= 3.0.0a7, so
this cannot be packaged until we have the new version of Cython
available.  The same applies for the JaroWinkler package.

   Julian



Bug#1019436: ITP: jarowinkler -- fast approximate string matching using Jaro(-Winkler) similarity

2022-09-09 Thread Julian Gilbey
Package: wnpp
Severity: wishlist
Owner: Julian Gilbey 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-python@lists.debian.org

* Package name: jarowinkler
  Version : 1.2.1
  Upstream Author : Max Bachmann 
* URL : https://github.com/maxbachmann/JaroWinkler
* License : MIT
  Programming Lang: Python
  Description : Fast approximate string matching using Jaro(-Winkler) 
similarity

JaroWinkler is a Python library to calculate the Jaro and Jaro-Winkler
similarity. It is easy to use, is much faster than other comparable
libraries, and is designed to integrate seamlessly with RapidFuzz.

There is also a C++ library contained within this package, managed as
a separate GitHub subrepository.  This is jarowinkler-cpp, and I am not
sure whether to bundle this as a single package or whether to package
this independently.

This is a dependency of python3-rapidfuzz, which is a new dependency
of the latest upstream release of python3-textdistance.

This package will be maintained within the Python Packaging Team.



Bug#1019431: ITP: rapidfuzz-capi -- C-API of the Python RapidFuzz package

2022-09-09 Thread Julian Gilbey
Package: wnpp
Severity: wishlist
Owner: Julian Gilbey 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-python@lists.debian.org

* Package name: rapidfuzz-capi
  Version : 1.0.5
  Upstream Author : Max Bachmann 
* URL : https://github.com/maxbachmann/rapidfuzz_capi
* License : MIT
  Programming Lang: Python
  Description : C-API of the Python RapidFuzz package, used to build 
rapidfuzz

 This package provides the C API of RapidFuzz. It can be used inside
 `pyproject.toml` to compile an extension module extending RapidFuzz.
 Providing this C API in a separate package simplifies the build process.
 .
 This package is only needed for building packages such as python3-rapidfuzz
 and python3-jarowinkler; it is not needed at runtime.

This is a dependency of python3-rapidfuzz and python3-jarowinkler,
string-matching packages which are new (recursive) dependencies of the
latest upstream release of python3-textdistance.

This package will be maintained within the Python Packaging Team.



Re: Auto-handling of closed bugs - how does it work?

2022-08-14 Thread Julian Gilbey
On Sun, Aug 14, 2022 at 08:45:32AM +, Stefano Rivera wrote:
> Hi Julian (2022.08.14_07:18:49_+)
> > A question of curiosity: when I push a commit to salsa with a "Closes:
> > #n" in the changelog, the BTS gets a "tag: pending" notification.
> > I looked and looked, and could not find out how salsa does this?
> > Could anyone enlighten me?  (The standard debian-ci scripts, which the
> > repositories use for their CI, appear to only do something with RC
> > bugs.)
> 
> It's a salsa webhook:
> https://wiki.debian.org/Salsa/Doc#Dealing_with_Debian_BTS_from_commit_messages
> 
> We don't have tooling that automatically configures all the repos, but
> when we migrated to salsa, we set them all up for tagpending, and
> posting to #debian-python-changes on IRC

Ah, super, thanks!  I'll start adding those to the projects I've
created.

Best wishes,

   Julian



Cython 3.0.0

2022-08-14 Thread Julian Gilbey
Dear all,

I am intending to package a new dependency of textdistance called
rapidfuzz (along with its dependencies jarowinkler and rapidfuzz-capi,
and including rapidfuzz-cpp and jarowinkler-cpp within the packages).
It's relatively low priority though (and I haven't filed ITPs yet).
But it needs cython 3.0.0alpha7 or later to be able to compile.

There is talk of moving cython 3.0.0 into beta in the not-too-distant
future: https://github.com/cython/cython/issues/4022  It does have
some breaking changes in comparison to cython 0.29.x.

I wonder what our strategy should be?  Here are three reasonable
approaches:

(1) Keep the existing cython package (source: cython, binaries:
cython3, cython-doc, cython3-dbg) and have a new package for the 3.*
releases.

Advantages:
* won't break lots of existing packages

Disadvantages:
* no obvious name for new package
* will end up with an old cython package over time that cannot be
  easily dropped
* will lead to confusion - what is the cython3 package, is it the new
  or old version of cython?

(2) Create a new cython0.29 package (source: cython0.29, binaries:
cython3-0.29, cython0.29-doc, cython3-0.29-dbg for example) to house
the "old" version, and the cython package becomes cython 3.0.0

Advantages:
* clear naming scheme
* those packages which "just work" with the new version of cython will
  not need to do anything to migrate
* allows the cython0.29 package to be dropped in time without needing
  lots of renaming once no packages still rely on it

Disadvantages:
* there are two packages to maintain instead of just one (cython0.29
  and cython)
* those packages which don't work with 3.0.0 will either need patching
  or their dependency will need to be changed to cython3-0.29

(3) Let the cython package become cython 3.0.0 once it is released.

Advantages:
* only one package to maintain
* keep at the cutting edge of cython development

Disadvantages
* may break lots of packages, requiring a lot of effort to patch them


I don't know how many packages in Debian would be broken by the move
to 3.0.0; that may be something worth exploring.  It may well be that
approach (2) makes most sense for the short term.

I imagine that this is unlikely to hit before the bookworm freeze, but
I wanted to flag it up now.

Best wishes,

   Julian



Re: Auto-handling of closed bugs - how does it work?

2022-08-14 Thread Julian Gilbey
On Sun, Aug 14, 2022 at 11:38:10AM -0400, Sandro Tosi wrote:
> > It's a salsa webhook:
> > https://wiki.debian.org/Salsa/Doc#Dealing_with_Debian_BTS_from_commit_messages
> >
> > We don't have tooling that automatically configures all the repos, but
> > when we migrated to salsa, we set them all up for tagpending, and
> > posting to #debian-python-changes on IRC
> 
> shameless plug, i fixed most of the packages in our repo to have the
> proper webhooks using
> https://github.com/sandrotosi/dpt-repos-check/blob/main/dpt-fix-integrations-webhooks.py
> (and now automation is available in pypi2deb, when you let it create
> the repo on salsa) -- consider: new packages won't get the right setup,
> old webhooks, etc.
> 
> I should probably run it more periodically

Oh wow, that's awesome, thanks!!

Best wishes,

   Julian



Auto-handling of closed bugs - how does it work?

2022-08-14 Thread Julian Gilbey
A question of curiosity: when I push a commit to salsa with a "Closes:
#n" in the changelog, the BTS gets a "tag: pending" notification.
I looked and looked, and could not find out how salsa does this?
Could anyone enlighten me?  (The standard debian-ci scripts, which the
repositories use for their CI, appear to only do something with RC
bugs.)

Best wishes,

   Julian



Re: Cython 3.0.0

2022-08-14 Thread Julian Gilbey
On Sun, Aug 14, 2022 at 08:49:06AM +, Stefano Rivera wrote:
> Hi Julian (2022.08.14_07:41:26_+)
> > I don't know how many packages in Debian would be broken by the move
> > to 3.0.0; that may be something worth exploring.  It may well be that
> > approach (2) makes most sense for the short term.
> 
> I think that's the first question to answer. Once we know how bad
> the incompatibilities are, we can decide on the best approach.
> 
> So, first step is probably to package the new cython version (locally),
> and try to rebuild everything against it.

That sounds sensible, indeed, once the beta is released.  As cython is
used quite widely (240 packages or so in testing), I wonder whether it
would be appropriate to upload it to experimental and ask Lucas to run
the test builds across the archive?

Best wishes,

   Julian



Re: eric and jquery.js to a symbolic link

2023-01-05 Thread Julian Gilbey
On Thu, Jan 05, 2023 at 07:14:40PM +, Guðjón Guðjónsson wrote:
> Hi list
> I am working on eric and I do have a problem with the lintian requirement to
> replace the jquery.js file with the debian provided jquery.js file.
> The upstream author pointed out that it doesn't work as well and I have 
> verified
> the behavior.
> If you run eric7_browser and press ==->Bookmarks->Speed Dial it doesn't show 
> the
> links when using the debian jquery.js file.
> Is it ok to keep the original jquery files and add a lintian-override in this
> case?
> Regards
> Gudjon

Hi Gudjon,

Looking at it, the problem seems to be that eric is using an ancient
version of jQuery (1.7.1, released Nov 22, 2011; 1.7.2 was released on
Mar 21, 2012).  jQuery-UI 1.8.16 is similarly old (released Aug 15,
2011, 1.8.17 was released on Nov 29, 2011).  You could embed it (not
the minimised version in the upstream eric sources, but rather the
original source from github.com/jquery/{jquery,jquery-ui}), but it
would be far from ideal - who knows how many bugs or possible security
issues there are in such an old version?  A much better solution, if
feasible, is to ask the eric upstream to switch to a recent version of
jQuery and jQuery UI, and to update the code depending on it
accordingly.  If upstream won't do that, then we should.

Best wishes,

   Julian



Re: Fixing upstream branch after tagging

2022-12-05 Thread Julian Gilbey
On Mon, Dec 05, 2022 at 06:24:48AM +, Guðjón Guðjónsson wrote:
> Hi list
> I am working on eric and I made a mistake while updating the git repository.
> Some paths have changed so files were not excluded correctly and now upstream
> and pristine-tar contain jquery*.js files.
> How can I remove the files after having tagged?
> I read that the pristine-tar branch should be removed [1]. Is that correct?
> Regards
> Gudjon

Hi Gudjon,

It depends on whether you have pushed to a remote repository yet, or
whether it's still only on your local machine.  If you've already
pushed, then you'll have to update your local versions and give it a
different version number (for example, +ds2 rather than +ds1), doing a
fresh gbp import-orig on the repacked source package.

If you're still only on your local machine, this is an error I have
made a number of times, only noticing after doing gpb import-orig.  To
fix it, you can roll back the gbp import-orig.  With care, do the
following (where git co is shorthand for git checkout):

git co debian/unstable  [or whatever your branch is]
git log
git reset --hard <last commit before the import>

git co upstream
git log
git reset --hard <last commit before the import>

git co pristine-tar
git log
git reset --hard <last commit before the import>

git tag -d upstream/<imported version>


There is probably a better way to do it, but this has worked for me.

Good luck!

   Julian



Re: Python 3.11 for bookworm?

2022-12-18 Thread Julian Gilbey
On Thu, Dec 15, 2022 at 04:10:05PM +0100, Thomas Goirand wrote:
> On 12/13/22 13:34, Julian Gilbey wrote:
> > If Python 3.11 is the default, then it is highly likely that Spyder
> > will not be included: debugpy, which is a dependency of Spyder and
> > python3-ipykernel (and lots of things that depend on that) seems to
> > require major work upstream to make it fully compatible with Python
> > 3.11.  This is work in progress, but I don't know whether it will be
> > ready in time for the freeze.  At the moment, I have worked around
> > this problem by just skipping the failing tests, but that is far from
> > an ideal solution.
> 
> It's probably ok if it's a *TEMPORARY* solution until upstream fixes
> everything in time for the release (which is months after the freeze). The
> question is: do you believe this may happen for let's say next March?

The truth is that I don't know.  Upstream is very good, but the Python
3.11 internal changes are very significant, and debugpy (along with
pydevd, which is part of debugpy) are deeply affected by this, as they
work at the level of Python's internals.  So I don't know how long it
will take them to make the required changes (and it's far beyond my
capability or capacity to do myself).  I can hope they'll do it in
time for the freeze, but I wouldn't like to place a bet on it.

Best wishes,

   Julian



Re: Python 3.11 for bookworm?

2022-12-19 Thread Julian Gilbey
On Sun, Dec 18, 2022 at 06:02:58PM +, Julian Gilbey wrote:
> On Thu, Dec 15, 2022 at 04:10:05PM +0100, Thomas Goirand wrote:
> > On 12/13/22 13:34, Julian Gilbey wrote:
> > > If Python 3.11 is the default, then it is highly likely that Spyder
> > > will not be included: debugpy, which is a dependency of Spyder and
> > > python3-ipykernel (and lots of things that depend on that) seems to
> > > require major work upstream to make it fully compatible with Python
> > > 3.11.  This is work in progress, but I don't know whether it will be
> > > ready in time for the freeze.  At the moment, I have worked around
> > > this problem by just skipping the failing tests, but that is far from
> > > an ideal solution.
> > 
> > It's probably ok if it's a *TEMPORARY* solution until upstream fixes
> > everything in time for the release (which is months after the freeze). The
> > question is: do you believe this may happen for let's say next March?
> 
> The truth is that I don't know.  Upstream is very good, but the Python
> 3.11 internal changes are very significant, and debugpy (along with
> pydevd, which is part of debugpy) are deeply affected by this, as they
> work at the level of Python's internals.  So I don't know how long it
> will take them to make the required changes (and it's far beyond my
> capability or capacity to do myself).  I can hope they'll do it in
> time for the freeze, but I wouldn't like to place a bet on it.

Quick update: with the updating of python3-bytecode from 0.13 to 0.14
in unstable/testing, which allows it to handle Python 3.11, something
has changed and now pydevd doesn't even pass the tests on Python
3.10.  The python3-bytecode underwent a major restructuring, so it is
entirely possible that something has changed that wasn't part of the
advertised API or something like that.  But that's for upstream pydevd
developers to deal with.

One possibility is that we revert to the situation in bullseye if
pydevd is not ready in time for the freeze.  Bullseye didn't have
bytecode/pydevd/debugpy, and it meant that debugging was limited in
Spyder: we would remove python3-debugpy from the dependencies of
python3-ipykernel and then the rest of the dependency chain would be
OK.  It's certainly not an ideal solution, but it may be the best we
can do if we're sticking with Python 3.11.  It may actually be worth
doing that at this point so that Spyder can stay in testing for the
time being, even though pydevd and debugpy won't.

Best wishes,

   Julian



Re: Python 3.11 for bookworm?

2022-12-19 Thread Julian Gilbey
Hi Jochen,

On Mon, Dec 19, 2022 at 04:53:58PM +0100, Jochen Sprickerhof wrote:
> Hi Julian,
> 
> * Julian Gilbey  [2022-12-19 09:41]:
> > Quick update: with the updating of python3-bytecode from 0.13 to 0.14
> > in unstable/testing, which allows it to handle Python 3.11, something
> > has changed and now pydevd doesn't even pass the tests on Python
> > 3.10.  The python3-bytecode underwent a major restructuring, so it is
> > entirely possible that something has changed that wasn't part of the
> > advertised API or something like that.  But that's for upstream pydevd
> > developers to deal with.
> 
> I've uploaded 0.14.0-2 that should fix this. As far as I've found that was
> only a minor fix in the Debian specific offset patch, sorry for the trouble.

Phew!  I didn't think to check that.  Unfortunately, though, there are
still numerous pydevd test errors even with 0.14.0-2, so I think
something has changed in bytecode that the pydevd maintainers will
have to adapt to.  So either I skip 14 newly failing tests on Python
3.10 (they're mostly skipped on 3.11 as the current pydevd version
skips bytecode tests on Python 3.11) or wait for a new version of
pydevd.  Hmmm.

Anyway, thanks so much for all your work updating this package - it's
been really helpful, as I've been a bit overloaded and Spyder 5.4.0
together with the Python 3.11 transition has been a lot to handle.  I
also learnt a lot from your changes!

Best wishes,

   Julian


