Re: Bug#972213: boost1.71: Please indicate some way which python versions you support

2020-10-16 Thread Drew Parsons

On 2020-10-16 14:43, Giovanni Mascellani wrote:

Hi,

Il 16/10/20 02:53, Drew Parsons ha scritto:


Would it make sense to use the Built-Using [1] header?

...

[1]
https://www.debian.org/doc/debian-policy/ch-relationships.html#additional-source-packages-used-to-build-the-binary-built-using


The very web page you are linking to hints that this use of Built-Using
would be improper:

"This field should be used only when there are license or DFSG
requirements to retain the referenced source packages. It should not be
added solely as a way to locate packages that need to be rebuilt 
against

newer versions of their build dependencies".


Ah yes, I had a feeling there were more reasons to not prefer this method!




That said, I forgot to mention that the Python versions Boost is
compiled against are also tracked in the package names provided by
libboost-python1.71.0, which are currently libboost-python1.71.0-py38
and libboost-python1.71.0-py39.

Is this better? More generally, there can be dozens of ways to
advertise which Python versions are used to build Boost.Python, but it
is not clear to me how this information should be consumed.



That sounds like a useful approach.

The ecflow build could access the provided versions via
  dpkg -s libboost-python${BOOST_VERSION} | grep Provides
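To make that concrete, here is a minimal sketch of parsing the Provides
field to recover the supported Python versions. The helper name and
regex are my own illustration, not part of any Debian tooling; the
sample value is taken from the package names quoted above.

```python
import re

# Hypothetical helper: recover the Python versions Boost.Python was
# built against from the virtual packages in the Provides: field.
# On a real system the input would come from:
#   dpkg -s libboost-python1.71.0
def provided_pythons(provides_field):
    # Provided package names end in -pyXY, e.g. libboost-python1.71.0-py38.
    return sorted(f"{major}.{minor}" for major, minor in
                  set(re.findall(r"-py(\d)(\d+)\b", provides_field)))

sample = "libboost-python1.71.0-py38, libboost-python1.71.0-py39"
print(provided_pythons(sample))  # ['3.8', '3.9']
```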



Bug#972213: boost1.71: Please indicate some way which python versions you support

2020-10-15 Thread Drew Parsons
Source: boost1.71
Followup-For: Bug #972213
X-Debbugs-Cc: debian-python@lists.debian.org

Would it make sense to use the Built-Using [1] header?
e.g.
  Built-Using: python3.8 python3.9

dh_python3 knows if the module includes extensions
(*_python*.so) and could inject the pythons into Built-Using.

dh_golang does something like this for the Go packages. There's a
nuance which I don't fully understand about the intended use of
Built-Using, which means it's not really the proper solution for the
Go packages.  I think that's related to the fact that Go applications
are statically linked.  The Python extensions are dynamically linked
so maybe Built-Using could work fine.

[1]
https://www.debian.org/doc/debian-policy/ch-relationships.html#additional-source-packages-used-to-build-the-binary-built-using



Bug#971573: RFP: python-cppimport -- import C or C++ files directly from Python

2020-10-01 Thread Drew Parsons
Package: wnpp
Severity: wishlist
X-Debbugs-Cc: debian-python@lists.debian.org

* Package name: python-cppimport
  Version : 17.09.18
  Upstream Author : Ben Thompson 
* URL : https://github.com/tbenthompson/cppimport
* License : MIT
  Programming Lang: Python
  Description : import C or C++ files directly from Python

Sometimes Python just isn't fast enough. Or you have existing code in
a C++ library.  cppimport combines the process of compiling and
importing an extension in Python so that you can type modulename =
cppimport.imp("modulename") and not have to worry about multiple
steps.

cppimport looks for a C or C++ source file that matches the requested
module. If such a file exists, the file is first run through the Mako
templating system. The compilation options produced by the Mako pass
are then used to compile the file as a Python extension. The extension
(shared library) that is produced is placed in the same folder as the
C++ source file. Then, the extension is loaded.

Most cppimport users combine it with pybind11, but you can use a range
of methods to create your Python extensions. Raw C extensions,
Boost.Python, and SWIG all work.




cppimport is used in some dolfinx tests.



Probably best maintained under the Debian Python Team.



Bug#970811: RFP: python3-sphinx-sitemap -- sphinx sitemap generator extension

2020-09-23 Thread Drew Parsons
Package: wnpp
Severity: wishlist
X-Debbugs-Cc: debian-python@lists.debian.org
Control: block 962440 by -1

* Package name: python3-sphinx-sitemap
  Version : 2.2.0
  Upstream Author : Jared Dillard 
* URL : https://github.com/jdillard/sphinx-sitemap
* License : MIT
  Programming Lang: Python
  Description : sphinx sitemap generator extension

A Sphinx extension to generate multiversion and multilanguage
sitemaps.org compliant sitemaps for the HTML version of your Sphinx
documentation.


This package is required in order to build the docs for MDAnalysis.


To be maintained under the Python Package Team alongside other sphinx
extensions.



Bug#968844: ITP: python-meshplex -- fast tools for simplex meshes

2020-08-22 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Owner: Drew Parsons 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-python@lists.debian.org, 
debian-scie...@lists.debian.org

* Package name: python-meshplex
  Version : 0.13.2
  Upstream Author : Nico Schlömer 
* URL : https://github.com/nschloe/meshplex
* License : GPL-3
  Programming Lang: Python
  Description : fast tools for simplex meshes

Compute all sorts of interesting points, areas, and volumes in
triangular and tetrahedral meshes, with a focus on efficiency. Useful
in many contexts, e.g., finite-element and finite-volume computations.

Required by dmsh and other tools created by Nico Schlömer.

To be maintained under the Debian Science Team alongside dmsh and
meshio.


Bug#966918: ITP: pygmsh -- combine the power of Gmsh with the versatility of Python

2020-08-03 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Owner: Drew Parsons 
X-Debbugs-Cc: debian-de...@lists.debian.org, debian-scie...@lists.debian.org, 
debian-python@lists.debian.org

* Package name: pygmsh
  Version : 6.1.1
  Upstream Author : Nico Schlömer 
* URL : https://github.com/nschloe/pygmsh
* License : GPL3
  Programming Lang: Python
  Description : combine the power of Gmsh with the versatility of Python

Gmsh is a powerful mesh generation tool with a scripting language that
is notoriously hard to write.

The goal of pygmsh is to combine the power of Gmsh with the
versatility of Python and to provide useful abstractions from the Gmsh
scripting language so you can create complex geometries more easily.

Used by dolfinx (in tests).

To be maintained with the Debian Science Team alongside pygalmesh.


Bug#966019: RFP: python-gsd -- native file format for HOOMD-blue

2020-07-22 Thread Drew Parsons
Package: wnpp
Severity: wishlist
X-Debbugs-Cc: debichem-de...@lists.alioth.debian.org, 
debian-scie...@lists.debian.org, debian-python@lists.debian.org
Control: block 962440 by -1

* Package name: python-gsd
  Version : 2.1.2
  Upstream Author : Joshua A. Anderson et al, University of Michigan
* URL : https://gsd.readthedocs.io
* License : BSD-2
  Programming Lang: Python
  Description : native file format for HOOMD-blue

The GSD file format is the native file format for HOOMD-blue. GSD
files store trajectories of the HOOMD-blue system state in a binary
file with efficient random access to frames. GSD allows all particle
and topology properties to vary from one frame to the next. Use the
GSD Python API to specify the initial condition for a HOOMD-blue
simulation or analyze trajectory output with a script. Read a GSD
trajectory with a visualization tool to explore the behavior of the
simulation.

Required by mdanalysis.

Suitable for packaging within the Debichem team, but the Debian Science
Team or Debian Python Module Team could also maintain it.



Bug#963605: RFP: python-language-server -- Python implementation of the Language Server Protocol

2020-06-24 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Control: block 946035 by -1
Control: block 946451 by -1

* Package name: python-language-server
  Version : 0.33.3
  Upstream Author : Palantir Technologies, Inc
* URL : https://github.com/palantir/python-language-server
* License : MIT
  Programming Lang: Python
  Description : Python implementation of the Language Server Protocol

The language server enables development IDEs such as Spyder to provide
Auto Completions, Code Linting, Definitions, Hover, References,
Signature Help, Document Symbols, Document Formatting.

Required in order to upgrade the popular Python IDE Spyder to its
latest version.

Ideally maintained by the Python Modules Team, but may be claimed by
other teams (the Python Applications Packaging Team already maintains
fortran-language-server).



Re: RFS: symfit/0.5.2-1 [ITP] -- Symbolic Fitting in Python, fitting as it should be

2020-06-23 Thread Drew Parsons

On 2020-06-24 01:57, Stephan Lachnit wrote:

-BEGIN PGP MESSAGE-
Version: ProtonMail

wcFMAysU9YM04hChAQ//cTKRX2ovcrbEV2oEgObcf8/pwIk/p6l0K7sqm2Im
NHnk1mPaSg1VqIdK+QlqGgomxU/oZDSicnjRI1a7dNRvbJuOvMDKMqacQUIc
zJWXCdOxBw361ut+2LrcbwgaMFcW/fuXPbFFt3k5cWNVv8+qBtSFq/VwsSMs



Heh that's kind of funny if your mail client is automatically encrypting 
to me, but not encrypting to the mailing list. Seems to defeat the point 
of encrypting... still, we should all be using it more.


It looks like you've added --with sphinxdoc to dh, but haven't actually 
triggered the doc build itself (dh --with sphinxdoc doesn't *build* the 
docs, it just manages the installation of them, and the value of 
${sphinxdoc:Depends})


See the commented part of dh_make's debian/rules.  Uncomment to get the 
docs building.
Or you can also do it just with cd docs; make html.  I'm not sure why 
dh_make uses a more complicated method. Maybe not all python modules 
provide a docs/Makefile.


Building man pages fails (with either method), but you don't need a 
manpage for a python module with no binary executable. HTML is enough.


Note the doc-base file in the dh_make templates.  It's nice to use this 
to register the docs (cf. file:///usr/share/doc/HTML/index.html)


Drew



Re: RFS: symfit/0.5.2-1 [ITP] -- Symbolic Fitting in Python, fitting as it should be

2020-06-22 Thread Drew Parsons

On 2020-06-23 09:55, Drew Parsons wrote:

Nice work.  I can sponsor this.

Drew


On 2020-06-23 01:00, Stephan Lachnit wrote:

Package: sponsorship-requests
Severity: wishlist

Dear mentors,

I am looking for a sponsor for my package "symfit"

 * Package name: symfit
   Version : 0.5.2-1
   Upstream Author : Martin Roelfs 
 * URL : https://github.com/tBuLi/symfit
 * License : GPL-2.0-or-later
 * Vcs : https://salsa.debian.org/stephanlachnit/symfit
   Section : python

It builds those binary packages:

  python3-symfit - Symbolic Fitting in Python, fitting as it should be




Hi Stephan, you haven't got a doc package in place yet for symfit.  But 
the docs build cleanly, and the template generated by dh_make gives the 
pattern for python packages like this.  Would you like to add this 
before uploading?


Setting up debian/tests can be useful too. With pytests it shouldn't be 
hard to do. (2 tests currently fail but should be straightforward to fix 
or work around).


debian/tests can be added later, but probably a good idea to add the doc 
package now. Otherwise it'll be waiting in the NEW queue twice.


Drew



Re: RFS: symfit/0.5.2-1 [ITP] -- Symbolic Fitting in Python, fitting as it should be

2020-06-22 Thread Drew Parsons

Nice work.  I can sponsor this.

Drew


On 2020-06-23 01:00, Stephan Lachnit wrote:

Package: sponsorship-requests
Severity: wishlist

Dear mentors,

I am looking for a sponsor for my package "symfit"

 * Package name: symfit
   Version : 0.5.2-1
   Upstream Author : Martin Roelfs 
 * URL : https://github.com/tBuLi/symfit
 * License : GPL-2.0-or-later
 * Vcs : https://salsa.debian.org/stephanlachnit/symfit
   Section : python

It builds those binary packages:

  python3-symfit - Symbolic Fitting in Python, fitting as it should be

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/symfit

Alternatively, one can download the package with dget using this 
command:


  dget -x
https://mentors.debian.net/debian/pool/main/s/symfit/symfit_0.5.2-1.dsc

Changes since the last upload:

   * Initial release. (Closes: #963503)

Regards,

--
  Stephan Lachnit






Bug#962691: RFP: python3-griddataformats -- handle data on a regular grid for molecular simulations

2020-06-11 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Control: block 962440 by -1

* Package name: python3-griddataformats
  Version : 0.5.0
  Upstream Author : Oliver Beckstein 
* URL : https://www.mdanalysis.org/GridDataFormats/
* License : LGPL3+
  Programming Lang: Python
  Description : handle data on a regular grid for molecular simulations

GridDataFormats is a pure Python library to handle data on a regular
grid using commonly used file formats in molecular simulations.

The gridData module contains a simple class Grid that makes it easier
to work with data on a regular grid. A limited number of commonly used
formats can be read and written.


Required by mdanalysis (ITP#962440).

Suitable for maintenance by the Debichem Team (alongside mdanalysis)
but could be taken up by other Teams.



Bug#962690: RFP: python3-mmtf -- binary encoding of biological structures

2020-06-11 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Control: block 962440 by -1

* Package name: python3-mmtf
  Version : 1.1.2
  Upstream Author : Anthony Bradley 
* URL : https://github.com/rcsb/mmtf-python
* License : Apache2
  Programming Lang: Python
  Description : binary encoding of biological structures (Python 3)

The Python implementation of the MMTF API, decoder and encoder
(http://mmtf.rcsb.org/).

The macromolecular transmission format (MMTF) is a binary encoding of
biological structures.

Required by mdanalysis (ITP#962440).

Suitable for team maintenance under Debichem Team (alongside
libmmtf-java) but other teams might also take it on.



Bug#962683: RFP: python3-msmb-theme -- applies slight modifications to sphinx_rtd_theme

2020-06-11 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Control: block 962442 by -1

* Package name: python3-msmb-theme
  Version : 1.2.0
  Upstream Author : Dave Snider
* URL : https://github.com/msmbuilder/msmb_theme
* License : MIT
  Programming Lang: Python
  Description : applies slight modifications to sphinx_rtd_theme

This applies slight modifications to sphinx_rtd_theme. It needs the
aforementioned theme to be installed.

Modifications:
  Styling tweaks in msmb.css
  Styling for Jupyter notebooks

Required by mdtraj (ITP#962442) (for generating docs with sphinx).

To be maintained by the Python Modules Team on behalf of the Debichem
Team.



Re: Bug#962341: Acknowledgement (ITP: ruamel.yaml.clib -- C based reader/scanner and emitter for ruamel.yaml)

2020-06-06 Thread Drew Parsons

Control: merge 955282 962341
thanks

Sorry, didn't check soon enough, Michael Crusoe is already working on 
ruamel.yaml.clib.


Drew



Bug#962341: ITP: ruamel.yaml.clib -- C based reader/scanner and emitter for ruamel.yaml

2020-06-06 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Owner: Drew Parsons 

* Package name: ruamel.yaml.clib
  Version : 0.2.0
  Upstream Author : Anthon van der Neut 
* URL : https://sourceforge.net/projects/ruamel-yaml-clib/
* License : MIT
  Programming Lang: Python/C
  Description : C based reader/scanner and emitter for ruamel.yaml

This package was split off from ruamel.yaml, so that ruamel.yaml can be
built as a universal wheel. Apart from the C code seldom changing, and
taking a long time to compile for all platforms, this allows
installation of the .so on Linux systems under /usr/lib64/pythonX.Y
(without a .pth file or a ruamel directory) and the Python code for
ruamel.yaml under /usr/lib/pythonX.Y.

Required by ruamel.yaml >= 0.16.8.

To be maintained by the Python Modules Team.



Bug#962340: RFP: python3-palettable -- a library of color palettes for Python

2020-06-06 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Control: block 962268 by -1

* Package name: python3-palettable
  Version : 3.3.0
  Upstream Author : Matt Davis
* URL : https://jiffyclub.github.io/palettable/
* License : MIT-like
  Programming Lang: Python
  Description : a library of color palettes for Python

Palettable (formerly brewer2mpl) is a library of color palettes for
Python. It's written in pure Python with no dependencies, but it can
supply color maps for matplotlib. You can use Palettable to customize
matplotlib plots or supply colors for a web application.

Required by pymatgen.

To be team maintained under the Python Modules Team.



Bug#962338: RFP: monty -- the missing complement to Python

2020-06-06 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Control: block 962268 by -1

* Package name: monty
  Version : 3.0.2
  Upstream Author : Materials Virtual Lab
* URL : https://github.com/materialsvirtuallab/monty
* License : MIT
  Programming Lang: Python
  Description : the missing complement to Python

Monty is the missing complement to Python. Monty implements
supplementary useful functions for Python that are not part of the
standard library. Examples include useful utilities like transparent
support for zipped files, useful design patterns such as singleton and
cached_class, and many more.

Python is a great programming language and comes with “batteries
included”. However, even Python has missing functionality and/or
quirks that make it more difficult to do many simple tasks. In the
process of creating several large scientific frameworks based on
Python, my co-developers and I have found that it is often useful to
create reusable utility functions to supplement the Python standard
library. Our forays in various developer sites and forums also found
that many developers are looking for solutions to the same problems.

Monty was created to serve as a complement to the Python standard
library. It provides a suite of tools to solve many common problems
and, hopefully, serves as a resource collecting the best solutions.


Required dependency for pymatgen.

To be maintained under the Python Modules Team 
on behalf of the Debichem Team.


Bug#959407: dh-python: pybuild without setup.py

2020-05-01 Thread Drew Parsons
Package: dh-python
Version: 4.20200315
Severity: normal


There is a movement among upstream packages to stop using setup.py:
https://stackoverflow.com/questions/58753970/how-to-build-a-source-distribution-without-using-setup-py-file

PEP517 seems to be the culprit behind this movement,
https://pypi.org/project/pep517/


If this is going to become the Way Of The Future,
then pybuild is going to want to support it.
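For reference, a PEP 517 project declares its build backend in a
pyproject.toml [build-system] table instead of shipping a setup.py
entry point. A minimal example (the setuptools backend here is just an
illustration; any backend can be named):

```toml
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"
```

pybuild would then need to invoke the backend's build hooks rather
than calling setup.py directly.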

Drew



Bug#948699: ITP: pytest-mpi -- a plugin for pytest testing MPI-related code

2020-01-11 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Owner: Drew Parsons 

* Package name: pytest-mpi
  Version : 0.3
  Upstream Author : James Tocknell
* URL : https://github.com/aragilar/pytest-mpi
* License : BSD
  Programming Lang: Python
  Description : a plugin for pytest testing MPI-related code

pytest-mpi provides a number of things to assist with using pytest
with MPI-using code, specifically:
 - Displaying of the current MPI configuration (e.g. the MPI
   version, the number of processes)
 - Sharing temporary files/folders across the MPI processes
 - Markers which allow for skipping or xfailing tests based on
   whether the tests are being run under MPI

Further documentation can be found at https://pytest-mpi.readthedocs.io.


This plugin is required by future releases of h5py to run h5py tests
on MPI functionality, see https://github.com/h5py/h5py/pull/1255

To be maintained under the Debian Python Modules Team, alongside pytest.



Bug#946625: scipy: autopkgtest regularly times out

2019-12-27 Thread Drew Parsons
Source: scipy
Followup-For: Bug #946625
Control: tags -1 help

The rules for skipping the failing tests
(test_sparsetools.TestInt32Overflow) are already in place in
debian/tests/python3

It looks like something must have changed in pytest such that the skip
instructions are now being ignored.

Needs a pytest syntax expert to fix.
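One possible workaround while the in-file skip rules are being ignored
is to pin the skip at the pytest command line instead. A sketch; the
test node id below is an assumption about scipy's layout, not verified:

```python
# Build a pytest invocation that deselects the long-running class by
# node id, independently of any skip logic inside debian/tests/python3.
deselect = [
    "--deselect",
    "scipy/sparse/tests/test_sparsetools.py::TestInt32Overflow",
]
cmd = ["python3", "-m", "pytest", *deselect, "--pyargs", "scipy"]
print(" ".join(cmd))
```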



Bug#946035: spyder: RFH: needs Build-Depends: python-language-server, qdarkstyle

2019-12-03 Thread Drew Parsons
Package: spyder
Version: 4.0.0~rc3+dfsg1-1
Severity: normal

spyder 4 (currently on salsa in experimental branch) needs python
modules python-language-server and qdarkstyle

These are currently not packaged for Debian, so this is a
Request For Help to get them packaged so we can proceed with the
spyder upgrade.

-- System Information:
Debian Release: bullseye/sid
  APT prefers unstable
  APT policy: (500, 'unstable'), (1, 'experimental')
Architecture: amd64 (x86_64)
Foreign Architectures: i386

Kernel: Linux 5.3.0-2-amd64 (SMP w/4 CPU cores)
Kernel taint flags: TAINT_OOT_MODULE, TAINT_UNSIGNED_MODULE
Locale: LANG=en_AU.UTF-8, LC_CTYPE=en_AU.UTF-8 (charmap=UTF-8), 
LANGUAGE=en_AU:en (charmap=UTF-8)
Shell: /bin/sh linked to /bin/dash
Init: systemd (via /run/systemd/system)
LSM: AppArmor: enabled

Versions of packages spyder depends on:
ii  python3-spyder  4.0.0~rc3+dfsg1-1
ii  python3.7   3.7.5-2

spyder recommends no packages.

spyder suggests no packages.

Versions of packages python3-spyder depends on:
ii  libjs-jquery  3.3.1~dfsg-3
ii  libjs-mathjax 2.7.4+dfsg-1
ii  pylint2.4.4-1
ii  python3   3.7.5-3
ii  python3-atomicwrites  1.1.5-2
ii  python3-chardet   3.0.4-4
ii  python3-cloudpickle   1.2.1-2
ii  python3-diff-match-patch  2018-2
ii  python3-intervaltree  3.0.2-1
ii  python3-jedi  0.14.1-1
ii  python3-keyring   18.0.1-1
ii  python3-nbconvert 5.6.0-2
ii  python3-numpydoc  0.7.0-1
ii  python3-pexpect   4.6.0-1
ii  python3-pickleshare   0.7.5-1
ii  python3-psutil5.6.7-1
ii  python3-pygments  2.3.1+dfsg-1
ii  python3-pympler   0.7+dfsg1-1
ii  python3-qtawesome 0.4.4+ds1-3
ii  python3-qtconsole 4.3.1-1
ii  python3-qtpy  1.3.1-3
ii  python3-sphinx1.8.5-3
ii  python3-spyder-kernels1.8.0-1
ii  python3-watchdog  0.9.0-3
ii  python3-zmq   17.1.2-3
ii  spyder-common 4.0.0~rc3+dfsg1-1

Versions of packages python3-spyder suggests:
ii  cython3 0.29.14-0.1+b1
ii  python3-matplotlib  3.0.2-2+b2
ii  python3-numpy   1:1.17.4-3
ii  python3-pandas  0.25.3+dfsg-4
ii  python3-pil 6.2.1-2+b1
ii  python3-scipy   1.3.3-1
ii  python3-sympy   1.4-1

Versions of packages python3-pyqt5 depends on:
ii  libc6 2.29-3
ii  libgcc1   1:9.2.1-21
ii  libpython3.7  3.7.5-2
ii  libqt5core5a [qtbase-abi-5-12-5]  5.12.5+dfsg-2
ii  libqt5dbus5   5.12.5+dfsg-2
ii  libqt5designer5   5.12.5-2
ii  libqt5gui55.12.5+dfsg-2
ii  libqt5help5   5.12.5-2
ii  libqt5network55.12.5+dfsg-2
ii  libqt5printsupport5   5.12.5+dfsg-2
ii  libqt5test5   5.12.5+dfsg-2
ii  libqt5widgets55.12.5+dfsg-2
ii  libqt5xml55.12.5+dfsg-2
ii  libstdc++69.2.1-21
ii  python3   3.7.5-3
ii  python3-sip [sip-py3api-12.7] 4.19.19+dfsg-2+b1

Versions of packages python3-pyqt5 suggests:
ii  python3-pyqt5-dbg  5.12.3+dfsg-3+b1

-- no debconf information



Re: python-urllib3 1.25.6 uploaded to experimental (closes CVE-2019-11236) but fails build tests

2019-10-27 Thread Drew Parsons

On 2019-10-27 23:13, Daniele Tricoli wrote:

On Sun, Oct 13, 2019 at 10:31:31PM +0800, Drew Parsons wrote:
It conditionally works.  Using curl, I found that TLSv1_0 or TLSv1_1
will support a successful connection, but only if the maximum
SSL_VERSION is constrained to TLSv1_0 or TLSv1_1 (e.g. curl -v
--tlsv1.1 --tls-max 1.1 https://pub.orcid.org). Without the max, the
connection fails:

$ curl --tlsv1.1  https://pub.orcid.org
curl: (35) error:14094410:SSL routines:ssl3_read_bytes:sslv3 alert
handshake failure

The urllib3 failure was similar, but I do not know how to set tls-max
with urllib3. I could only find the option with curl.  I could set up
a custom HTTPAdapter as suggested at
https://requests.readthedocs.io/en/master/user/advanced/#example-specific-ssl-version
to set ssl_version=ssl.PROTOCOL_TLSv1_1 but the ssl module doesn't
have the SSLVERSION_MAX_TLSv1_1 value that curl has. I could solve it
with pycurl using c.setopt(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_1 |
pycurl.SSLVERSION_MAX_TLSv1_1)


For sure I'm missing something, but why not just set TLS version?
I tried the following on both Python2 and Python3:

>>> import ssl
>>> from urllib3.poolmanager import PoolManager
>>> http = PoolManager(ssl_version=ssl.PROTOCOL_TLSv1)
>>> r = http.request('GET', 'https://pub.orcid.org')
>>> r.status
200



That's a good tip, I missed that permutation. I was originally trying to 
access using the requests module, so didn't think to do it directly with 
urllib3.PoolManager.





Evidently the orcid server only supports TLSv1.0 and TLSv1.1 and no
higher (why haven't they activated TLSv1.3 yet?!), while curl and
urllib3 without tls-max first test TLSv1.3 and then quit without
cascading downwards once they receive the TLSv1.3 handshake failure.
Which is rather odd behaviour when I think about it.  The whole point
of supporting multiple protocol versions is to try the next available
version if the first one doesn't work.


Not an expert here, but I think the fallback is deliberately not done,
to guard against downgrade attacks:
https://en.wikipedia.org/wiki/Downgrade_attack



I see. Still an odd kind of protection though.  The attacker can just 
downgrade themselves.



I had a closer look.  The failing tests were in python2 only, coming
from the non-ascii (Gërman http://Königsgäßchen.de/straße and Japanese
http://ヒ:キ@ヒ.abc.ニ/ヒ?キ#ワ) unicode url tests.

...

Fixed adding python{,3}-idna on B-D. I had to add python3-idna
because the same tests were failing also on Python3 when I tested
them building on DoM.



Thanks for that, and thanks again for the PoolManager tip.

Drew



Re: python-urllib3 1.25.6 uploaded to experimental (closes CVE-2019-11236) but fails build tests

2019-10-13 Thread Drew Parsons

Daniele wrote:
I hope to have the time to investigate also this:
urllib3/contrib/pyopenssl.py contains code to have SSL with
SNI-support for Python 2, and it depends on pyOpenSSL, cryptography
and idna. Maybe looking at them can give us more clues.


Also, could you see if, using Python3, the connection to
https://pub.orcid.org works?


It conditionally works.  Using curl, I found that TLSv1_0 or TLSv1_1 
will support a successful connection, but only if the maximum 
SSL_VERSION is constrained to TLSv1_0 or TLSv1_1 (e.g. curl -v --tlsv1.1 
--tls-max 1.1  https://pub.orcid.org). Without the max, the connection 
fails:

$ curl --tlsv1.1  https://pub.orcid.org
curl: (35) error:14094410:SSL routines:ssl3_read_bytes:sslv3 alert 
handshake failure


The urllib3 failure was similar, but I do not know how to set tls-max 
with urllib3. I could only find the option with curl.  I could set up a 
custom HTTPAdapter as suggested at 
https://requests.readthedocs.io/en/master/user/advanced/#example-specific-ssl-version 
to set ssl_version=ssl.PROTOCOL_TLSv1_1 but the ssl module doesn't have 
the SSLVERSION_MAX_TLSv1_1 value that curl has. I could solve it with 
pycurl using c.setopt(pycurl.SSLVERSION, pycurl.SSLVERSION_TLSv1_1 | 
pycurl.SSLVERSION_MAX_TLSv1_1)
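For what it's worth, the stdlib ssl module (Python 3.7+) does expose a
maximum-version knob that plays the role of curl's --tls-max. Passing
such a context to urllib3 via PoolManager(ssl_context=ctx) is the
assumed integration point; I haven't verified it against pub.orcid.org:

```python
import ssl

# Cap the handshake at TLS 1.1, analogous to curl --tlsv1.1 --tls-max 1.1.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_1
ctx.maximum_version = ssl.TLSVersion.TLSv1_1

print(ctx.maximum_version)
```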


Evidently the orcid server only supports TLSv1.0 and TLSv1.1 and no 
higher (why haven't they activated TLSv1.3 yet?!), while curl and 
urllib3 without tls-max first test TLSv1.3 and then quit without 
cascading downwards once they receive the TLSv1.3 handshake failure.  
Which is rather odd behaviour when I think about it.  The whole point of 
supporting multiple protocol versions is to try the next available 
version if the first one doesn't work.




The package build was successful on my system but gives build-time
errors in chroot (on buildd).  I'm not sure why that's failing.

I will look at them during this weekend, I already had a look at the
build log from the phone, but it's better to look from a PC.


I had a closer look.  The failing tests were in python2 only, coming 
from the non-ascii (Gërman http://Königsgäßchen.de/straße and Japanese 
http://ヒ:キ@ヒ.abc.ニ/ヒ?キ#ワ) unicode url tests. So from one perspective we 
don't need to worry so much about them, we could just disable them (e.g. 
prepend @onlyPy3 to test_parse_url and test_url_vulnerabilities in 
test_util.py). We'll be dropping python2 anyway in the near future.


On the other hand, given the nature of the vulnerabilities and the 
possible uses of urllib3, it's probably best not to leave python2 
untested, especially since they are known to pass on python2 anyway in 
the right conditions.  Probably there is some package that should be 
added to Build-Depends to enable python2 tests to pass, though I have no 
idea which that package might be.


Drew



python-urllib3 1.25.6 uploaded to experimental (closes CVE-2019-11236) but fails build tests

2019-10-12 Thread Drew Parsons
Hi Daniele, just letting you know I uploaded python-urllib3 1.25.6 to 
experimental.


I was having some SSL trouble connecting to https://pub.orcid.org.  The 
error trace cited urllib3/contrib/pyopenssl.py, so I downloaded and 
installed python-urllib3 1.25.6 to see if updates to default SSL/TLS 
versions made any difference.  It didn't fix my problem, but since I had 
the package update ready I figured I might as well present it to 
experimental.


The new version fixes CVE-2019-11236 (Bug#927172).  As far as I can tell 
it also fixes CVE-2019-11324 (Bug#927412), but I figured it's best to 
let you review that.


The package build was successful on my system but gives build-time errors 
in chroot (on buildd).  I'm not sure why that's failing.


Drew




Re: dh-python does not stop if there is error during the byte-compiling

2019-10-07 Thread Drew Parsons

Frederik wrote:


Hello, in one of my packages (pymca), there is a syntax error like this.

byte-compiling 
/builds/science-team/pymca/debian/output/pymca-5.5.2+dfsg/debian/python-pymca5/usr/lib/python2.7/dist-packages/PyMca5/Object3D/Object3DPlugins/ChimeraStack.py 
to ChimeraStack.pyc
  File 
"/usr/lib/python2.7/dist-packages/PyMca5/Object3D/Object3DPlugins/ChimeraStack.py", 
line 72

with h5py.File(filename, mode='r') as f
  ^
SyntaxError: invalid syntax

(missing ':' at the end of the line) but this does not stop the build 
process.

Is this normal?




The error got through both dh_auto_install and pybuild --install.  I'm 
pretty sure dh_auto_install will halt on error (or more correctly 
debian/rules will halt when dh_auto_install returns an error).  I would 
have thought pybuild would also catch the error and pass it on to 
dh_auto_install.  Could it be that upstream setup.py returns 0 when the 
SyntaxError happens or something similar, i.e. ignores the error? The 
problem must be either there or in pybuild.
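A small sketch of why that theory is plausible: Python's compileall
module reports a SyntaxError through its return value rather than by
raising, so any wrapper that ignores the result will carry on. (This is
an illustration of the failure mode, not a claim about what pybuild
actually calls.)

```python
import compileall
import pathlib
import tempfile

# Write a module with the same class of error (missing ':').
src = pathlib.Path(tempfile.mkdtemp()) / "broken.py"
src.write_text("with open('f') as f\n    pass\n")

# compile_file swallows the SyntaxError and signals it only via the
# return value; quiet=2 suppresses the printed traceback.
ok = compileall.compile_file(str(src), quiet=2)
print(bool(ok))  # False
```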


Drew



Re: should Debian add itself to https://python3statement.org ?

2019-09-14 Thread Drew Parsons

On 2019-09-12 22:46, Drew Parsons wrote:

https://python3statement.org/ is a site documenting the projects which
are supporting the policy of dropping Python2 to keep Python3 only.

The site is designed for python packages specifically, to have only
Python3 supported by end of 2020.

But it seems to me it would be in the spirit of the site to add
Debian's pledge to remove Python2 (we are currently in the middle of
doing just that).

Is this a thing that we want to do as a project, to add Debian to
https://python3statement.org/ ?



Thanks all for the discussion.

Looks like we don't have consensus for listing on python3statement.org.  
As a whole-system distribution, we're running on a different timeframe 
to the individual python packages.


But in any case, the process of removing python2 packages from Debian is 
underway. By end of 2020 we might be able to judge whether it will be a 
Release Target for bullseye.


Drew



Re: skimage (build-)depends on python-cloudpickle which has already been dropped

2019-09-13 Thread Drew Parsons

On 2019-09-13 15:42, Andreas Tille wrote:

Hi,

as Peter Green found out the cloudpickle source package droped Python2
support which makes bug #938494 serious.  I guess it would be even
harder to drop python-skimage right now since this would affect
python-numpy and according to

   $ apt-cache rdepends python-numpy | sort | uniq | wc -l

339 reverse dependencies.  I have no better solution than just pinging
bug #938494 from time to time - and to ping this list to work harder on
Python3 migration.



It is something of a problem, so I'm cc:ing debian-python.

There have been a number of packages dropping python2 support 
out-of-sequence, causing this problem.  Not just cloudpickle, also 
pyqtgraph, which still had pymca as a dependent.  Fortunately pymca was 
already ready to drop the dependency, but still, how many other examples 
are there?


Python maintainers, remember, check your reverse dependencies before 
dropping your python2 packages.

Check each of

  build-rdeps python-yourmodule
  apt-rdepends -r python-yourmodule

and confirm the package has rdeps=0 on Sandro's list at 
http://sandrotosi.me/debian/py2removal/index.html


Drew



should Debian add itself to https://python3statement.org ?

2019-09-12 Thread Drew Parsons
https://python3statement.org/ is a site documenting the projects which 
are supporting the policy of dropping Python2 to keep Python3 only.


The site is designed for python packages specifically, to have only 
Python3 supported by end of 2020.


But it seems to me it would be in the spirit of the site to add Debian's 
pledge to remove Python2 (we are currently in the middle of doing just 
that).


Is this a thing that we want to do as a project, to add Debian to 
https://python3statement.org/ ?


Drew



numpy updates

2019-09-06 Thread Drew Parsons
Hi Sandro, I've pushed a numpy patch that should allow pymca tests to 
pass again (bugs #935454, #933056). Once pymca is running successfully 
again, we'll be able to remove its python2.


I recommend uploading numpy 1:1.16.2-2 to help that along (probably 
safer to patch 1.16 to fix these bugs before updating to 1.17 I reckon).


Drew



Re: Webpage to track py2removal bugs & packages

2019-09-03 Thread Drew Parsons

On 2019-09-03 11:56, Sandro Tosi wrote:

Within a given rdeps count it currently has secondary sorting made on
Bug No. It would polish off the forward deps if they could be used for
secondary sorting instead (highest number to lowest).  Bonus points for
making the headers clickable so the reader can choose which secondary
sorting is most useful (Bug No. vs Binary Pkg vs Maintainer vs # deps)


with the latest update the table is initially sorted by (# rdeps, #
fdeps); i've also made the header clickable, so that you can sort the
table on that column, but it doesn't preserve sorting between clicks
(i.e. if you click columnA and then columnB, the sorting on cB will
lose entirely the sorting that was previously set on cA, so you can't
click on # fdeps and then # rdeps and expect to return to the initial
sorting).


Looks good to me. 2-state sorting is plenty good, I think.



since i've already got all the information there, i was thinking if we
should start posting a status of the rdepends in each bug, would it be
worth the effort? i'm not sure i want to spam too frequently the BTS,
so what would be an acceptable frequency: once a week?  if we want to
pile on this, all the blocks/blockedby bts massaging could be done by
this script i guess?



I'm less certain about this.  Once a week would be too often I think.  
We're at the point now where we know the python2 decommissioning needs 
to be done. You've given us the list of packages, we just need to go 
start working on them.


Drew



Re: Webpage to track py2removal bugs & packages

2019-09-02 Thread Drew Parsons

On 2019-09-02 12:21, Sandro Tosi wrote:
On Sun, Sep 1, 2019 at 9:54 PM Drew Parsons  
wrote:

Yes, your script counts the dependencies along one direction (rdeps),
identifying which packages are ready to be de-python2-ised next.

I'm talking about dependencies in the opposite direction, deps not
rdeps.  Upstream vs downstream.

My question is, of the 844 packages now currently on rdeps=0 and ready
for processing, which one should be processed first?  Which one will
free up the largest number of upstream packages?  Which one gives the
biggest bang for buck?


gotcha, i hope.

I've added the number of forward dependencies in the last update to
the webpage; it's not super-accurate (f.e. python dep is counted
twice, due to the nature of how it's produced, > 2.7 < 2.8) but it
should give a general idea of what you asked for


Looks great, it's plenty accurate enough for the task. It tells us it's 
more important to process live-task-standard than python-gnatpython-doc.


Within a given rdeps count it currently has secondary sorting made on 
Bug No. It would polish off the forward deps if they could be used for 
secondary sorting instead (highest number to lowest).  Bonus points for 
making the headers clickable so the reader can choose which secondary 
sorting is most useful (Bug No. vs Binary Pkg vs Maintainer vs # deps)


Drew



Re: Webpage to track py2removal bugs & packages

2019-09-01 Thread Drew Parsons

On 2019-09-02, Sandro Tosi wrote:

On 2019-09-02 01:15, Drew Parsons wrote:

Sandro Tosi wrote:

i've prepared a small website,
http://sandrotosi.me/debian/py2removal/index.html, to keep track of
the bugs user-tagged `py2removal`.



It could be useful to add another column counting the Dependencies in
the other direction.

i.e. for all the leaf packages ready for processing, which one should
be given priority?  Which one has the most impact on Dependencies
further down the tree?


I'm afraid i don't understand the question, or maybe it's just a
different process than the one i use.

the main idea (behind the webpage) is that you can't remove a package
while it still has any reverse-dependency: so if the count is 0,
you can go ahead and remove it.

if it has rdeps, then you need to address them _first_, and that's
where the rdeps graph comes in handy, as it tells you the actual
packages.



Yes, your script counts the dependencies along one direction (rdeps), 
identifying which packages are ready to be de-python2-ised next.


I'm talking about dependencies in the opposite direction, deps not 
rdeps.  Upstream vs downstream.


My question is, of the 844 packages now currently on rdeps=0 and ready 
for processing, which one should be processed first?  Which one will 
free up the largest number of upstream packages?  Which one gives the 
biggest bang for buck?


Drew

--
please CC: me in replies



Re: Webpage to track py2removal bugs & packages

2019-09-01 Thread Drew Parsons

Sandro Tosi wrote:

i've prepared a small website,
http://sandrotosi.me/debian/py2removal/index.html, to keep track of
the bugs user-tagged `py2removal`.



It could be useful to add another column counting the Dependencies in 
the other direction.


i.e. for all the leaf packages ready for processing, which one should be 
given priority?  Which one has the most impact on Dependencies further 
down the tree?


Drew



Bug#939134: ITP: python-pypathlib -- Polygon package for Python

2019-09-01 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Owner: Drew Parsons 

* Package name: python-pypathlib
  Version : 0.1.2
  Upstream Author : Nico Schlömer 
* URL : https://github.com/nschloe/pypathlib
* License : MIT
  Programming Lang: Python
  Description : Polygon package for Python

Lightweight package for working with 2D paths/polygons.

pypathlib is fully vectorized, so it's pretty fast. (Not quite as fast
as matplotlib.path.contains_points though.)

pypathlib is required by dmsh (python-dmsh), a python module capable
of generating 2D meshes.

To be packaged under the Debian Science team alongside other related
packages by the same author: meshio (mesh file conversion), pygalmesh
(3D meshes), and dmsh (2D meshes)


Bug#935908: ITP: dmsh -- simple 2D mesh generator inspired by distmesh

2019-08-27 Thread Drew Parsons
Package: wnpp
Severity: wishlist
Owner: Drew Parsons 

* Package name: python-dmsh
  Version : 0.1.3
  Upstream Author : Nico Schlömer 
* URL : https://github.com/nschloe/dmsh
* License : MIT
  Programming Lang: Python
  Description : simple mesh generator inspired by distmesh

 dmsh: "The worst mesh generator you'll ever use."
 
 Inspired by distmesh, dmsh is slow, requires a lot of memory, and
 isn't terribly robust either.

 On the plus side, it's got a usable interface, is pure Python (and
 hence easily installable on any system), and if it works, it produces
 pretty high-quality meshes.

 Combined with optimesh, dmsh produces the highest-quality 2D meshes
 in the west.

 Example capabilities:
 * Primitives
   - circle, rectangle, polygon
   - halfspace
 * Combinations
   - difference
   - nonconstant edge length
   - union
   - intersection
 * Transformations
   - rotation, translation, scaling
 * Local refinement


A simple-to-use tool for creating 2D meshes. Complements mshr
(which is not actively developed)

To be packaged under the Debian Science team alongside other related
packages by the same author: meshio (mesh file conversion), pygalmesh
(3D meshes)

Some debate about source package name: dmsh? python-dmsh? python3-dmsh?
A quick poll on irc indicates some preference for python-dmsh. Further
debate welcome.


Re: increased build time and disk usage for scipy with pybuild

2019-07-28 Thread Drew Parsons

On 2019-07-27 22:04, Drew Parsons wrote:

I've uploaded python-scipy 1.2.2 to unstable.

Previously the build system ran through distutils, but dh now gives an
error on that, and says pybuild should be used instead.  So I
reorganised debian/rules to use dh --buildsystem=pybuild.  I also
reorganised rules to stop the second invocation of dh in the
build-arch rule.

The build completes successfully.  But it is taking 3 times longer
than it did before.  It is also using around 75% more disk space than
before.

...

Is an increase in build resources like this known to happen when
pybuild is used instead of distutils?



This seems to be why scipy 1.2.2-1 used more build resources:
scipy modules are built in 2 steps:
1) setup.py config_fc --noarch build (with override_dh_auto_configure)
2) setup.py install ... --force --no-compile (with 
override_dh_auto_install)


Previously (building with distutils), both steps built via a 
build/src.linux-x86_64-* builddir, so the --no-compile flag meant object 
files were not compiled twice.


Now with pybuild, the config_fc step is still built in 
build/src.linux-x86_64-*, but the install step is handled via 
build/src.linux-amd64-*.  So object files are compiled twice, first for 
x86_64, and then for amd64.  Likewise arm64 first "configures" builds in 
an aarch64 builddir, and then "installs" via arm64.


Looks like some discrepancy between setup.py config_fc and pybuild's 
setup.py install in handling DEB_*_ARCH_CPU and DEB_*_GNU_CPU.


DEB_*_CPU is not used explicitly by scipy's debian/rules.

Is setup.py config_fc even needed?  Is it a hangover from distutils and 
not needed in a pybuild build?  It's not well documented and not listed 
by "python3 setup.py --help-commands", though is mentioned in 
INSTALL.rst.txt (with a brief reference in scipy/special/setup.py and 
scipy/integrate/setup.py). But debian/rules doesn't use it with 
--fcompiler to specify the Fortran compiler.


Drew



increased build time and disk usage for scipy with pybuild

2019-07-27 Thread Drew Parsons

I've uploaded python-scipy 1.2.2 to unstable.

Previously the build system ran through distutils, but dh now gives an 
error on that, and says pybuild should be used instead.  So I 
reorganised debian/rules to use dh --buildsystem=pybuild.  I also 
reorganised rules to stop the second invocation of dh in the build-arch 
rule.


The build completes successfully.  But it is taking 3 times longer than 
it did before.  It is also using around 75% more disk space than before.


I guess the increase is probably not due to the new 1.2.2 release 
itself, otherwise I'd have expected an increase also in 1.2.1 (in 
experimental) compared to 1.1.0, which didn't happen.  I figure it's 
probably not simply that the buildds are busy catching up after the 
buster release, since I'm not seeing an increase in build time of other 
packages.  pybuild seems a more likely culprit (if not some other aspect 
of debhelper compatibility level 12. Previously debhelper 9 was used).


Is an increase in build resources like this known to happen when pybuild 
is used instead of distutils?



Drew



Re: Transition tracker for rm python2 live

2019-07-23 Thread Drew Parsons

Scott K wrote:

On 2019-07-24 09:01, eamanu15 . wrote:

On Tue, 23 Jul 2019 at 21:55, Drew Parsons wrote:


What should "success" or completion look like on the tracker?   I
uploaded pyfftw (not python-fftw, which is a different package) hoping
to get the first line of good green, but it has gone yellow rather than
green.


mmm maybe yellow is ok?

Once the python2 binaries are decrufted it should disappear off the 
tracker entirely.


I see.  The yellow means the package has been updated to python3 only, 
but the old python2 cruft is still sitting there.


Drew




Re: Transition tracker for rm python2 live

2019-07-23 Thread Drew Parsons

Scott Kitterman wrote:

See https://release.debian.org/transitions/html/python2-rm.html

The ones near the top of the stack (bottom of the page) are less likely to
have rdepends that need to be sorted before action can be taken on them.



What should "success" or completion look like on the tracker?   I 
uploaded pyfftw (not python-fftw, which is a different package) hoping 
to get the first line of good green, but it has gone yellow rather than 
green.


Drew



Re: dropping python2 [was Re: scientific python stack transitions]

2019-07-07 Thread Drew Parsons

On 2019-07-07 23:31, Matthias Klose wrote:

On 07.07.19 16:55, Drew Parsons wrote:

On 2019-07-07 22:46, Mo Zhou wrote:

Hi science team,

By the way, when do we start dropping python2 support?
The upstreams of the whole python scientific computing
stack had already started dropping it.


Good question.  I think it is on the agenda this cycle, but
debian-python will have the call on it.


you can start dropping it now, however please don't drop anything yet with
reverse dependencies.  So leaf packages first.


That's a sensible order.  It puts packages on the same footing as new 
packages, which are already forbidden by lintian from building python2 
packages.


Drew



dropping python2 [was Re: scientific python stack transitions]

2019-07-07 Thread Drew Parsons

On 2019-07-07 22:46, Mo Zhou wrote:

Hi science team,

By the way, when do we start dropping python2 support?
The upstreams of the whole python scientific computing
stack had already started dropping it.


Good question.  I think it is on the agenda this cycle, but 
debian-python will have the call on it.


Drew



Re: overriding default compile flags when building python extensions

2019-05-03 Thread Drew Parsons

Ansgar wrote:

On Fri, 2019-05-03 at 19:49 +0800, Drew Parsons wrote:
> The first -g is the problem (in "-DNDEBUG -g -fwrapv -O2 -Wall").

Not sure about setup.py, but `-g0` (after `-g`) should also disable
debug information:

+---
| Level 0 produces no debug information at all.  Thus, -g0 negates -g.
+---



Thanks Ansgar. Weird (dangerous, even) if the flags can't be controlled 
directly.

But I can try the -g0 trick and see if it solves the problem.

Drew



overriding default compile flags when building python extensions

2019-05-03 Thread Drew Parsons

Dear Debian Python community,

how does one control or override the default flags used by setup.py 
(setuptools module) when compiling a C extension for a python module?


The intention is to switch off -g in pygalmesh, because of memory 
constraints on 32 bit systems when compiling against CGAL, see 
Bug#928140, https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=928140


A failed build, e.g. i386 at 
https://buildd.debian.org/status/fetch.php?pkg=pygalmesh&arch=i386&ver=0.3.1-1&stamp=1556128605&raw=0
shows -g is used twice.  The second usage comes from CFLAGS in 
dpkg-buildflags, which I can control.


The first -g is the problem (in "-DNDEBUG -g -fwrapv -O2 -Wall").  The 
advice from the internet (stackoverflow) is exceedingly poor on this 
question, in most cases addressing how to add flags, not how to change 
the existing default flags. Alternatives like

  OPT="" python3 ./setup.py
simply do not work.

The default in question appears to be the OPT variable defined in 
/usr/lib/python3.7/config-3.7m-x86_64-linux-gnu/Makefile


But how to override it?
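
For reference, the interpreter's baked-in flags can be inspected with the stdlib sysconfig module — a minimal sketch (the exact flag strings vary by Python build, so the values shown in comments are only examples):

```python
# The first -g in "-DNDEBUG -g -fwrapv -O2 -Wall" comes from the
# interpreter's own build configuration; distutils merges these values
# into the compile command for every C extension it builds.
import sysconfig

opt = sysconfig.get_config_var("OPT")        # e.g. "-DNDEBUG -g -fwrapv -O2 -Wall"
cflags = sysconfig.get_config_var("CFLAGS")  # the broader compiler flag set
print("OPT    =", opt)
print("CFLAGS =", cflags)
```

Since gcc honours the last -g* option on the command line, appending -g0 to the package's own CFLAGS (which distutils adds after OPT) should neutralise the baked-in -g without having to edit the config Makefile.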

Upstream (Nico) suggests instead to build with CC=clang++.  Is this a 
good alternative for Debian python modules?


Drew



Re: scipy 1.2.0 and joining DPMT

2019-03-16 Thread Drew Parsons

On 2019-03-17 02:48, Drew Parsons wrote:


Hi Dmitry, the Tools section still refers to git-dpm.


Ah, that would be MR-5, still in discussion.



Re: scipy 1.2.0 and joining DPMT

2019-03-16 Thread Drew Parsons

On 2019-03-17 02:39, Dmitry Shachnev wrote:

On Fri, Mar 15, 2019 at 02:34:26PM +0100, Ondrej Novy wrote:

> https://salsa.debian.org/python-team/tools/python-modules/merge_requests/4

let's merge it without "Configurations" section now, please.


Done!

I will now also review the follow-up merge requests.



Hi Dmitry, the Tools section still refers to git-dpm.

Drew.



Re: unblock request: python-scipy/1.1.0-4 skimage/0.14.2-2: autopkgtest passes (Re: bug#919929)

2019-03-16 Thread Drew Parsons

On 2019-03-16 22:07, Paul Gevers wrote:

On 16-03-2019 13:48, Drew Parsons wrote:


Is there enough will to add more scipy patches for the buster release to
reduce the remaining DeprecationWarnings? (they don't break tests,
they're just annoying)
Or should we just let it go at this point and let them get cleared in
future versions?


I'd let it be for now.


No problem, will do (more precisely, not do).



That being the case, in the interests of making a stable release that
passes it own tests, I'd like to request an unblock for
python-scipy/1.1.0-4 (together with skimage/0.14.2-2)


skimage was already unblocked. I'll unblock python-scipy as well.



Thanks Paul.  Looks like we'll be able to close Bug#919929 once 
python-scipy/1.1.0-4 is settled into testing.


Drew



unblock request: python-scipy/1.1.0-4 skimage/0.14.2-2: autopkgtest passes (Re: bug#919929)

2019-03-16 Thread Drew Parsons

On 2019-03-11 14:39, Drew Parsons wrote:


I've adapted the 3 patches and pushed to salsa,
   matrix_API_614847c5.patch
   matrix_API_more_e0cfa29e2.patch
   matrix_API_filter_check_87e48c3c5.patch
https://salsa.debian.org/python-team/modules/python-scipy/tree/master/debian/patches

...

The numpy.sparse tests pass with this patch, and most of the matrix
PendingDeprecationWarnings are gone (the upstream patch missed
integrate/tests/test_ivp.py, but the remaining warnings are few enough
to not need to worry about).


Well, turns out the other warnings worried Aurelien enough to file 
Bug#924396.


Is there enough will to add more scipy patches for the buster release to 
reduce the remaining DeprecationWarnings? (they don't break tests, 
they're just annoying)
Or should we just let it go at this point and let them get cleared in 
future versions?


...


With these patches, the sparse matrix tests pass. There remain three
errors unrelated to sparse matrix:
  spatial/tests/test__plotutils.py::TestPlotting::test_delaunay FAILED
[ 76%]
  spatial/tests/test__plotutils.py::TestPlotting::test_voronoi FAILED
[ 76%]
  spatial/tests/test__plotutils.py::TestPlotting::test_convex_hull
FAILED  [ 76%]

,,,

  >   with suppress_warnings as sup:
  E   AttributeError: __enter__

Apart from that, I'm happy to upload the sparse matrix patches once
the s390x update reaches testing.



Those errors must have been local to me.  scipy 1.1.0-4 now passes debci 
tests cleanly.


That being the case, in the interests of making a stable release that 
passes it own tests, I'd like to request an unblock for 
python-scipy/1.1.0-4 (together with skimage/0.14.2-2)


Drew



Re: scipy 1.2.0 and joining DPMT

2019-03-16 Thread Drew Parsons

On 2019-03-15 21:15, Ondrej Novy wrote:

Hi,


Thanks Ondrej, read and accepted.


welcome :)


Thanks :) I fixed scipy's test failures :)



The policy on Maintainer/Upload fields is interesting.  I've
installed git-dpm, likely it will be useful for my other packages.


please don't use git-dpm in DPMT :)


Yeah,  I got the update :)  quilt / dpkg-source --commit will do.

Drew



Re: python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-11 Thread Drew Parsons

On 2019-03-10 15:46, Drew Parsons wrote:

On 2019-03-10 03:51, Paul Gevers wrote:

Hi Drew,




To remove the deprecation warnings we'd need to deal with them at the
source. Upstream has patches
https://github.com/scipy/scipy/commit/614847c5fc8d5f8a618980df3c1b93540428ae46

https://github.com/scipy/scipy/commit/e0cfa29e2fbe86f994924c0c7514ff5bbfffd514

and for completeness
https://github.com/scipy/scipy/commit/87e48c3c54d7a85bc6628c88c1de98ac0469b6fa

...

Can you authorise an unblock to apply these 3 upstream patches to
python-scipy 1.1.0 ?
That won't necessarily fix the debci failure, but it will make it a lot
easier to see what's actually failing.


I am slightly unhappy with the second patch, as it seems to be doing
more than you describe above in a few places. This may be correct but
that is difficult to quickly judge.


I've adapted the 3 patches and pushed to salsa,
   matrix_API_614847c5.patch
   matrix_API_more_e0cfa29e2.patch
   matrix_API_filter_check_87e48c3c5.patch
https://salsa.debian.org/python-team/modules/python-scipy/tree/master/debian/patches

The other behaviour that you saw in the second patch I think might be 
replacement of "D*diag(v)" with "D@diag(v)". That's matrix 
multiplication, so still relevant to the matrix API patch. @ is not 
available in python2 so I changed it to numpy.matmul, which does the 
same thing.
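
As a plain-Python illustration (nested lists rather than numpy; the class and function names here are made up for the sketch), the @ operator simply dispatches to a __matmul__ hook, which is why numpy.matmul(D, diag_v) computes the same product as D @ diag_v while remaining parseable by Python 2:

```python
def matmul(a, b):
    """Naive dense matrix product on nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

class Mat:
    def __init__(self, rows):
        self.rows = rows
    def __matmul__(self, other):   # what "D @ V" calls under the hood
        return Mat(matmul(self.rows, other.rows))

D = Mat([[1, 2], [3, 4]])
V = Mat([[5, 0], [0, 6]])          # diag(v) for v = (5, 6)
assert (D @ V).rows == matmul(D.rows, V.rows) == [[5, 12], [15, 24]]
```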


The numpy.sparse tests pass with this patch, and most of the matrix 
PendingDeprecationWarnings are gone (the upstream patch missed 
integrate/tests/test_ivp.py, but the remaining warnings are few enough 
to not need to worry about).




Also, what is the general documented way that these wrappers can be
used? scipy is sort of taking over the maintenanceship of these
functions in this way if we're not careful.



...

There is discussion of the distinction between numpy.matrix and
numpy.ndarray (which is at the heart of the deprecation warning) at
https://docs.scipy.org/doc/scipy/reference/tutorial/linalg.html#numpy-matrix-vs-2d-numpy-ndarray

The utility class scipy.sparse.sputils itself is apparently
undocumented, by which I infer it's intended for internal use only,
not a public API. I guess it's reasonable for a package to be testing
its own internal functions.  Strange thing is,
scipy.sparse.sputils.matrix is not actually defined in
scipy/sparse/sputils.py. Must be inherited or defined in some deep
python-fu that I haven't mastered yet.



Actually scipy.sparse.sputils.matrix was defined in these patches.  It 
is a bit odd that upstream is wrapping numpy.matrix just to avoid 
deprecation warnings, rather than following their own advice and using 
numpy.ndarray instead.



With these patches, the sparse matrix tests pass. There remain three 
errors unrelated to sparse matrix:
  spatial/tests/test__plotutils.py::TestPlotting::test_delaunay FAILED   
  [ 76%]
  spatial/tests/test__plotutils.py::TestPlotting::test_voronoi FAILED
  [ 76%]
  spatial/tests/test__plotutils.py::TestPlotting::test_convex_hull 
FAILED  [ 76%]
  __ TestPlotting.test_delaunay 
__


  self = 0x7f0f31156eb8>


  def test_delaunay(self):
  # Smoke test
  fig = plt.figure()
  obj = Delaunay(self.points)
  s_before = obj.simplices.copy()
  >   with suppress_warnings as sup:
  E   AttributeError: __enter__

  fig= 
  obj= 
  s_before   = array([[3, 1, 0],
 [2, 3, 0]], dtype=int32)
  self   = at 0x7f0f31156eb8>


  
/usr/lib/python3/dist-packages/scipy/spatial/tests/test__plotutils.py:31: 
AttributeError


(likewise for the other 2)
These AttributeErrors are mentioned at 
https://github.com/scipy/scipy/issues/9491. Upstream doesn't seem too 
concerned about them.
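
The error itself is just the context-manager protocol failing on a class object: the test enters suppress_warnings rather than suppress_warnings(), so Python looks up __enter__ on the object's type (here, `type`) and finds nothing. A small stand-in class (hypothetical, not scipy's actual code) reproduces it:

```python
class SuppressWarnings:
    """Stand-in for a suppress_warnings-style context manager."""
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        return False

with SuppressWarnings() as sup:    # correct: instantiate first
    ok = True

try:
    with SuppressWarnings as sup:  # buggy: the bare class, as in the failing test
        pass
    raised = False
except (AttributeError, TypeError):  # exact exception type varies by Python version
    raised = True
```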


Apart from that, I'm happy to upload the sparse matrix patches once the 
s390x update reaches testing.


Drew



Re: python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-09 Thread Drew Parsons

On 2019-03-10 03:51, Paul Gevers wrote:

Hi Drew,

On 08-03-2019 03:08, Drew Parsons wrote:

On 2019-03-07 20:46, Paul Gevers wrote:
If you upload now, your package will not migrate to testing before the
full freeze becomes effective so it would need an unblock. If you want
to fix this issue with the three lines I saw in the bug report, you can
go ahead. However, it is probably worth waiting for a resolution of bug
915738 and combine it with that.



Alas, the deprecation patch (in python-scipy 1.1.0-3) doesn't actually
prevent emission of the deprecation warnings, so they're still spamming
the debci log.


Do you have evidence they did anything at all? If not, please revert
this if we get to a new upload.


It would seem it did not help.  In any case, the upstream patches 
supersede this patch, so it will be removed naturally.



To remove the deprecation warnings we'd need to deal with them at the
source. Upstream has patches
https://github.com/scipy/scipy/commit/614847c5fc8d5f8a618980df3c1b93540428ae46

https://github.com/scipy/scipy/commit/e0cfa29e2fbe86f994924c0c7514ff5bbfffd514

and for completeness
https://github.com/scipy/scipy/commit/87e48c3c54d7a85bc6628c88c1de98ac0469b6fa


The deprecation problem (matrix API) appears in many places, but the fix
is straightforward: replace np.matrix with matrix from
scipy.sparse.sputils

Can you authorise an unblock to apply these 3 upstream patches to
python-scipy 1.1.0 ?
That won't necessarily fix the debci failure, but it will make it a lot
easier to see what's actually failing.


I am slightly unhappy with the second patch, as it seems to be doing
more than you describe above in a few places. This may be correct but
that is difficult to quickly judge.


The patches as they are don't apply cleanly to the 1.1.0 source, so I'll 
need to adapt them anyway.  I can retain only the ones relevant to 
updating the matrix API.




Also, what is the general documented way that these wrappers can be
used? scipy is sort of taking over the maintenanceship of these
functions in this way if we're not careful.



It's a good question that the other scipy maintainers might have thought 
more about.  As far as I can tell, the scipy tests affected here involve 
sparse matrices.  The trouble arises from an "inadequacy" in the core 
numpy API, with numpy.matrix only being suitable for dense matrices.  
scipy could be described as "numpy+algorithms", with additional 
algorithms required to handle sparse matrices, provided in 
scipy.sparse.sputils.matrix.


numpy.matrix is documented at 
https://docs.scipy.org/doc/numpy/reference/generated/numpy.matrix.html


The scipy sparse matrix API is at 
https://docs.scipy.org/doc/scipy/reference/sparse.html, but that's 
specifically for scipy.sparse.spmatrix


There is discussion of the distinction between numpy.matrix and 
numpy.ndarray (which is at the heart of the deprecation warning) at 
https://docs.scipy.org/doc/scipy/reference/tutorial/linalg.html#numpy-matrix-vs-2d-numpy-ndarray


The utility class scipy.sparse.sputils itself is apparently 
undocumented, by which I infer it's intended for internal use only, not 
a public API. I guess it's reasonable for a package to be testing its 
own internal functions.  Strange thing is, scipy.sparse.sputils.matrix 
is not actually defined in scipy/sparse/sputils.py. Must be inherited or 
defined in some deep python-fu that I haven't mastered yet.


I'll check that I can adapt those upstream patches to cleanly remove 
these deprecation warnings.


Drew



Re: python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-07 Thread Drew Parsons

On 2019-03-07 20:46, Paul Gevers wrote:


If you upload now, your package will not migrate to testing before the
full freeze becomes effective so it would need an unblock. If you want
to fix this issue with the three lines I saw in the bug report, you can
go ahead. However, it is probably worth waiting for a resolution of bug
915738 and combine it with that.



Alas, the deprecation patch (in python-scipy 1.1.0-3) doesn't actually 
prevent emission of the deprecation warnings, so they're still spamming 
the debci log.  On the bright side, s390x is now using gfortran-8 
successfully (#915738).


To remove the deprecation warnings we'd need to deal with them at the 
source. Upstream has patches

https://github.com/scipy/scipy/commit/614847c5fc8d5f8a618980df3c1b93540428ae46
https://github.com/scipy/scipy/commit/e0cfa29e2fbe86f994924c0c7514ff5bbfffd514
and for completeness
https://github.com/scipy/scipy/commit/87e48c3c54d7a85bc6628c88c1de98ac0469b6fa

The deprecation problem (matrix API) appears in many places, but the fix 
is straightforward: replace np.matrix with matrix from 
scipy.sparse.sputils


Can you authorise an unblock to apply these 3 upstream patches to 
python-scipy 1.1.0 ?
That won't necessarily fix the debci failure, but it will make it a lot 
easier to see what's actually failing.


Drew



Re: python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-07 Thread Drew Parsons

On 2019-03-07 22:07, Paul Gevers wrote:

Hi Drew,

On 07-03-2019 14:56, Drew Parsons wrote:

On 2019-03-07 20:46, Paul Gevers wrote:

However, it is probably worth waiting for a resolution of bug
915738 and combine it with that.


There hasn't been recent movement on 915738. I'll apply Julian's patch
and see how we go.


Huh. There was a comment from doko that the underlying issue is fixed in
gcc-8, no? I think you only need to switch to the default gfortran. On
the other hand, I don't know how feasible it is at this moment to not
release buster with gcc-7. Maybe other release team members can comment
on that?


Perhaps it's ok now, as Matthias says.  Testing on zelenka now. I'll do 
the 919929 upload if s390x proves fine.


Drew



Re: python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-07 Thread Drew Parsons

On 2019-03-07 20:46, Paul Gevers wrote:

Hi Drew,

On 07-03-2019 13:19, Drew Parsons wrote:


python-scipy is currently failing all debci tests in both unstable and
testing.


autopkgtest only, so no FTBFS?


That's right, scipy builds fine.


Some of us want failing autopkgtest to be RC *after* the release of
buster. I am not aware of consensus about that yet. autopkgtest
*regression* in testing is effectively RC since the soft freeze of
12-2-2019. The autopkgtest of python-scipy is already failing in
testing, so it isn't a regression. Hence, it failing is not RC for 
buster.


Thanks, that clarifies the appropriate severity.

Certainly please do unblock if the full freeze is already in place. But
my intention was to first upload python-scipy 1.1 with a small patch,
not 1.2 just yet.


If you upload now, your package will not migrate to testing before the
full freeze becomes effective so it would need an unblock. If you want
to fix this issue with the three lines I saw in the bug report, you can
go ahead. However, it is probably worth waiting for a resolution of bug
915738 and combine it with that.


There hasn't been recent movement on 915738. I'll apply Julian's patch 
and see how we go.


Drew



Re: python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-07 Thread Drew Parsons

On 2019-03-07 19:00, Paul Gevers wrote:

Hi Drew,

On 07-03-2019 09:03, Andreas Tille wrote:

On Tue, Mar 05, 2019 at 07:01:54PM +0800, Drew Parsons wrote:
python-scipy has recently started failing all debci tests in testing and
unstable, exacerbating the bug report in Bug#919929 [1].

The failing error is a MemoryError. But understanding the problem is
hampered by a flood of deprecation warnings, presumably triggered by numpy
1.16. scipy 1.2 added instructions to pytest.ini to ignore the warnings.


Bug#919929 has not yet been marked RC but I guess it's about to happen.


Can you elaborate why you think that bug should be RC (as that isn't
clear to me from the report itself) and why you haven't marked it as
such if you think it should be?


python-scipy is currently failing all debci tests in both unstable and 
testing.


I haven't marked it RC myself since I'm not 100% certain what the usual 
protocol is for marking the severity of debci test failures.  But as I 
understood it, debci test failures are considered RC under the final 
freeze which we're about to enter (but we're not quite in that deep 
freeze yet).




I propose in the first instance to apply a patch of the pytest.ini diff
between scipy 1.1 and 1.2 and see if that clears things up. I'll commit the
patch now. Should I proceed with upload to unstable?


Maybe it's sensible to coordinate with debian-release (list in CC) and
file an unblock request *before* uploading.  I confirm that I usually
upload simple fixes and ask for unblock afterwards, but you intend to do
a version change which does not qualify as a simple fix (though I agree
with you that it is sensible).


Hi Andreas, I may not have been clear.  What I mean at this point is to 
upload a small patch for scipy 1.1 to ignore the deprecation warnings 
(scipy 1.2 already does that).


If that doesn't help pass the debci tests then we can consider uploading 
scipy 1.2 instead.  But python-fluids and dipy FTBFS against scipy 1.2 
(a new version of dipy is available which presumably fixes that).



Slightly depending on the answer above, I'll unblock an upload of
python-scipy with only that change.


Certainly please do unblock if the full freeze is already in place. But 
my intention was to first upload python-scipy 1.1 with a small patch, 
not 1.2 just yet.


Drew



python-scipy: autopkgtest fails (Re: bug#919929)

2019-03-05 Thread Drew Parsons
python-scipy has recently started failing all debci tests in testing and 
unstable, exacerbating the bug report in Bug#919929 [1].


The failing error is a MemoryError. But understanding the problem is 
hampered by a flood of deprecation warnings, presumably triggered by 
numpy 1.16. scipy 1.2 added instructions to pytest.ini to ignore the 
warnings.
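The same kind of filtering can be demonstrated at the Python level; a minimal sketch, where `noisy_routine` is a hypothetical stand-in for a warning-heavy scipy call:

```python
import warnings

def noisy_routine():
    # Hypothetical stand-in for a scipy call that emits a deprecation warning.
    warnings.warn("this routine is deprecated", DeprecationWarning)
    return 42

# Suppress the DeprecationWarning flood so the real error (e.g. a
# MemoryError traceback) stands out -- analogous to what a pytest.ini
# filterwarnings entry does for the test suite.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", DeprecationWarning)
    value = noisy_routine()

# The call still returns normally, and the warning is neither shown
# nor recorded.
print(value, len(caught))  # → 42 0
```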


Bug#919929 has not yet been marked RC but I guess it's about to happen.  
I propose in the first instance to apply a patch of the pytest.ini diff 
between scipy 1.1 and 1.2 and see if that clears things up. I'll commit 
the patch now. Should I proceed with upload to unstable?


Drew

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=919929



Re: scipy 1.2.0 and joining DPMT

2019-02-25 Thread Drew Parsons

On 2019-01-29 10:36, Dmitry Shachnev wrote:


   On Mon, Jan 28, 2019 at 11:13:46AM +0100, webm...@emerall.com wrote:


On Monday, January 28, 2019 10:35 CET, Ondrej Novy  
wrote:

to join DPMT, you need to read and accept our policy:
https://salsa.debian.org/python-team/tools/python-modules/blob/master/policy.rst


   Thanks Ondrej, read and accepted.
   The policy on Maintainer/Upload fields is interesting.

...
By the way I apologise for my webmail client showing my email plumbing 
(it should have replied as dpars...@debian.org not 
webm...@emerall.com). My email provider (gandi.net) recently changed 
its webmail interface, but the new software doesn't handle Reply-To 
addresses well. I've sent them a bug report.



On 2019-01-15 03:27, Drew Parsons wrote:

Hi Python team, now that numpy 1.16rc has reached testing, are there
plans to get scipy 1.2.0 into the coming release?

Will it help if I join DPMT?  I can then update scipy and upload to
experimental. Please add me on salsa if that will help.



Hi again Ondrej and Debian Python Team, my request to join DPMT still 
hasn't been processed. My salsa id is dpars...@debian.org
There's a new update to scipy which I can help get into experimental, 
then we can test dipy again.


Drew



Re: scipy 1.2.0 and joining DPMT

2019-01-29 Thread Drew Parsons

On 2019-01-29 10:36, Dmitry Shachnev wrote:

On Mon, Jan 28, 2019 at 11:13:46AM +0100, webm...@emerall.com wrote:

Thanks Ondrej, read and accepted.  
The policy on Maintainer/Upload fields is interesting.
I've installed git-dpm, likely it will be useful for my other 
packages.


It will not be useful in DPMT/PAPT. The policy is outdated and we have
switched to using gbp instead, as described here:

https://wiki.debian.org/Python/GitPackaging


Ah good, thanks for the update Dmitry.

By the way I apologise for my webmail client showing my email plumbing 
(it should have replied as dpars...@debian.org not webm...@emerall.com). 
 My email provider (gandi.net) recently changed its webmail interface, 
but the new software doesn't handle Reply-To addresses well.  I've sent 
them a bug report.


Drew



scipy 1.2.0 and joining DPMT

2019-01-14 Thread Drew Parsons
Hi Python team, now that numpy 1.16rc has reached testing, are there 
plans to get scipy 1.2.0 into the coming release?


Will it help if I join DPMT?  I can then update scipy and upload to 
experimental. Please add me on salsa if that will help.


Drew



Re: xhtml2pdf updated, but can't push to salsa

2018-02-22 Thread Drew Parsons
On Thu, 2018-02-22 at 12:11 +0100, Raphael Hertzog wrote:
> Hi,
> 
> On Thu, 22 Feb 2018, Ondrej Novy wrote:
> > 2018-02-22 11:18 GMT+01:00 Drew Parsons <dpars...@debian.org>:
> > > The python group on salsa does not have the button for joining
> > > the
> > > group (I don't actually want to join the group, but the commits
> > > to
> > > xhtml2pdf should be pushed).
> > 
> > I'm sorry, but you need to join the group to push commits into it.
> 
> I added Drew Parsons to the project directly (i.e. not the group)
> so that he can push his work.
> 

Thanks Raphael, pushed now.

Happy for you python guys to take over from here, I only needed
xhtml2pdf to get sasview building (and for this upload, I wanted to get
the proper release in place rather than a beta release).

Drew




xhtml2pdf updated, but can't push to salsa

2018-02-22 Thread Drew Parsons
I updated xhtml2pdf to 0.2.1.  But salsa is refusing to accept the git
push: "GitLab: You are not allowed to push code to this project.".

The python group on salsa does not have the button for joining the
group (I don't actually want to join the group, but the commits to
xhtml2pdf should be pushed).

Drew