Re: RFS: python-click-default-group: Extension for Python click adding default subcommand to group

2021-09-29 Thread Ghislain Vaillant
On Wed, 29 Sept 2021 at 23:14, Dominik George wrote:

>
> > and that will require a new upstream release, which does not help
> > when you want/need to package the current one
>
> Most upstreams kindly make .post releases immediately.
>

I found that to be pretty rare in my own experience.

> Maybe I am just lucky with upstreams...

You are, indeed.


Re: Looking to help

2021-01-18 Thread Ghislain Vaillant
On Monday 18 January 2021 at 23:34 -0800, Perry Aganad wrote:
> Greetings everyone!
> 
> I have been using Debian for a while now and I am at a point where I
> want to help and start contributing to Debian itself. I read the web
> page about contributing and I took away that I should just jump right
> in, and I specifically jumped here because I do know how to write
> Python scripts. I am new at this but I'm a quick learner, so please
> let me know if I can help out in any way, thanks!

You may consider joining the team [1] and start contributing to
existing packages in need of maintenance [2], such as those with test
failures or RC bugs.

[1] https://wiki.debian.org/Teams/PythonTeam/HowToJoin
[2] https://tracker.debian.org/teams/python-modules/

Best regards,
Ghis



Re: 2to3 adds '.' in front dir of "from dir import ..." statements (Was: [MoM] lefse migration to python 3])

2019-09-10 Thread Ghislain Vaillant
It results from the ambiguity between absolute and relative imports in
Python 2.

Here, 2to3 considers your imports to be relative, hence the added dot. I
believe no dots would be added if a `from __future__ import
absolute_import` were found in the preamble of the module.
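As a quick check, this hedged sketch (using a hypothetical helper module
named `lefse`) reproduces both behaviours: the explicit relative import
fails in a script run directly as `__main__`, while the absolute form
works because the script's directory is on sys.path:

```python
import os
import shutil
import subprocess
import sys
import tempfile

# A script executed directly runs as __main__, which is not a package,
# so the explicit relative import that 2to3 inserted cannot resolve.
d = tempfile.mkdtemp()
with open(os.path.join(d, "lefse.py"), "w") as f:
    f.write("VALUE = 1\n")

script = os.path.join(d, "plot_features.py")
with open(script, "w") as f:
    f.write("from .lefse import *\n")
relative = subprocess.run([sys.executable, script],
                          capture_output=True, text=True)

# The absolute import works: sys.path[0] is the script's directory.
with open(script, "w") as f:
    f.write("from lefse import *\nprint(VALUE)\n")
absolute = subprocess.run([sys.executable, script],
                          capture_output=True, text=True)
shutil.rmtree(d)
```

The first run aborts with an import error much like the traceback quoted
below, while the second runs fine.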

Hope this helps.

Ghis

On Tue, 10 Sept 2019 at 07:51, Andreas Tille wrote:

> Hi,
>
> in the process of the Python3 migration the package lefse was converted
> using 2to3.  The changes can be found in git[1].  I'm wondering about
> the following diff created by 2to3:
>
>   - from lefse import *
>   + from .lefse import *
>
> When calling a random binary of the resulting binary package lefse I
> experienced:
>
> $ plot_features
> Traceback (most recent call last):
>   File "/usr/bin/plot_features", line 6, in <module>
> from .lefse import *
> ModuleNotFoundError: No module named '__main__.lefse'; '__main__' is not a
> package
>
>
> I think the line
>
>from lefse import *
>
> should remain to keep that script functional.  I now checked another
> package (cain - nothing pushed yet) and here also 2to3 is changing
>
>from something import *
>
> to
>
>from .something import *
>
> Could somebody please enlighten me about this added '.' which does not
> seem to work?
>
> Kind regards
>
>   Andreas.
>
> [1] https://salsa.debian.org/med-team/lefse
>
> --
> http://fam-tille.de
>
>


Re: Best way to handle circular build deps to make a pypy- package

2018-08-15 Thread Ghislain Vaillant
On Wed, 15 Aug 2018 at 12:11, Samuel Thibault wrote:

> Pierre-Elliott Bécue, on Wed 15 Aug 2018 13:05:09 +0200, wrote:
> >  2. What's the proper way to handle such packages?
>
> Build profiles? You can annotate the build-dep needed for check with
> <!nocheck> so that one can easily (re)bootstrap the circle at any time
> by using DEB_BUILD_OPTIONS=nocheck dpkg-buildpackage -Pnocheck
>
> Samuel
>

I second this. Build profiles are the way to go. You can find plenty of
examples in codesearch.debian.net.
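For instance, a minimal control sketch of Samuel's suggestion (package
names hypothetical):

```
Source: python-foo
Build-Depends: debhelper (>= 11),
               dh-python,
               python3-all,
               python3-bar <!nocheck>,
```

A bootstrap rebuild then uses `DEB_BUILD_OPTIONS=nocheck
dpkg-buildpackage -Pnocheck`, exactly as quoted above.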

Ghis

>


Re: Is there a tool to debianize for the first time a package from pypi

2018-08-03 Thread Ghislain Vaillant
Hi Fred,

You can probably use the packaging of one of the Spyder plugins as a
basis for src:spyder-kernels.

Otherwise, I can do it if you prefer.

On Fri, 3 Aug 2018 at 14:45, PICCA Frederic-Emmanuel <
frederic-emmanuel.pi...@synchrotron-soleil.fr> wrote:

> Hello, I need to create a new package for
>
> spyder_kernels.
>
> Is there a tool which allows creating the first version of a package,
> i.e. which creates the debian/ directory from the setup.py file?
>
> thanks
>
> Frederic
>


Re: Several questions regarding package tifffile

2018-04-09 Thread Ghislain Vaillant
On Mon, 9 Apr 2018 at 08:46, Andreas Tille wrote:

> Hi,
>
> I realised that upstream of tifffile[1] switched to Python3.  When
> inspecting the package I somehow think that its main purpose is rather a
> user application than a Python module and thus I would rather rename the
> binary from python-tifffile to tifffile than python3-tifffile.  What do
> you think?
>
> As per [2] I added dh-python to the Build-Depends but it seems I would
> need python-minimal in addition to get /usr/bin/pyversions.  For me it
> is a bit strange to realise that this basic functionality is not provided
> by dh-python "Depends: python-minimal".  Am I missing something?
>
> It also seems that a Build-Depends: python3-distutils and if I add it it
> tries to do "python setup.py clean -a" which is stupid since there is no
> setup.py (and also no python but only python3).  I wonder whether I'm
> missing something here as well in this very simple package which does
> not seem to work well with dh-python.
>
> BTW, I'd be *really* happy if somebody from Python team would take over
> this package into Python packaging team.  I just adopted it from a
> former Debian Med team member but it is not Debian Med specific at all
> and I have no real interest in this package.
>

I could have a look at it. Do you know why the package was needed
originally? I am surprised it landed in d-med, so perhaps there is another
package needing it?


> Kind regards
>
>Andreas.
>
> PS: Please CC me, I'm not subscribed to this list.
>
> [1] https://salsa.debian.org/med-team/tifffile
> [2] https://people.debian.org/~piotr/dh_python2_without_dh-python.ddlist
>
> --
> http://fam-tille.de
>

Cheers,
Ghis

>


Re: Where to put docs of a -doc package for python 2 + 3 modules?

2018-03-13 Thread Ghislain Vaillant
On Tuesday 13 March 2018 at 09:18 +0100, Thomas Goirand wrote:
> On 03/13/2018 12:29 AM, Ghislain Vaillant wrote:
> > Imo, we should just make it clear in policy that source packages
> > should be named `foo` or `python-foo`,
> > and corresponding doc packages should be named `foo-doc` or
> > `python-foo-doc`.
> 
> Very often, "foo" is already taken by another package, and we have to
> fallback to python-foo. Think about generic libs for compression,
> images, network standards...

Agreed.

> Which is why I think we should standardize on python-foo for the
> source package (which is what I do).

Same here.

Do you think we should be recommending the use of `python-foo` in the
DPMT packaging policy? This could help avoid ITPs for `python3-bar`.

Ghis



Re: Where to put docs of a -doc package for python 2 + 3 modules?

2018-03-12 Thread Ghislain Vaillant
2018-03-12 22:30 GMT+00:00 W. Martin Borgert :
> On 2018-03-12 23:15, Thomas Goirand wrote:
>> But what now that python-foo is gone? Should I rename the doc package?
>
> No, but that's just my gut feeling.

Definitely not, indeed. The python- prefixes in python-foo and
python-foo-doc are not exactly equivalent: the former refers to the
Python 2 interpreter, the latter to Python *the language*.

It's unfortunate our naming convention did not explicitly distinguish
between the two as Arch packages do, whereby `python` is used for the
source package and `python2` for binary packages targeting CPython 2.x.

Imo, we should just make it clear in policy that source packages should
be named `foo` or `python-foo`, and corresponding doc packages should be
named `foo-doc` or `python-foo-doc`.

Ghis



RFS: pytest-qt/2.3.1-1

2018-02-22 Thread Ghislain Vaillant
Package: sponsorship-requests
Severity: normal

Dear mentors,

I am looking for a sponsor for my package "pytest-qt"

* Package name: pytest-qt
  Version : 2.3.1-1
  Upstream Author : Bruno Oliveira
* URL : https://github.com/pytest-dev/pytest-qt
* License : Expat
  Section : python

It builds those binary packages:

  python-pytestqt-doc - documentation for pytest-qt
  python3-pytestqt - pytest plugin for Qt application testing (Python 3)

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/pytest-qt

Alternatively, one can download the package with dget using this
command:

  dget -x 
https://mentors.debian.net/debian/pool/main/p/pytest-qt/pytest-qt_2.3.1-1.dsc

Changes since the last upload:

  [ Ondřej Nový ]
  * d/control: Set Vcs-* to salsa.debian.org

  [ Ghislain Antony Vaillant ]
  * New upstream version 2.3.1
  * Refresh the patch queue
  * Update the copyright years
  * Normalize the package descriptions
  * Drop the get-orig-source target
  * Bump the debhelper version to 11
  * Bump the standards version to 4.1.3
  * Set PYTEST_QT_API before running the tests
  * Increase verbosity of autopkgtests

Regards,
Ghislain Vaillant



RFS: python-mechanicalsoup/0.10.0-1

2018-02-22 Thread Ghislain Vaillant
Package: sponsorship-requests
Severity: normal

Dear mentors,

I am looking for a sponsor for my package "python-mechanicalsoup"

* Package name: python-mechanicalsoup
  Version : 0.10.0-1
  Upstream Author : Mirth Hickford <mirth.hickf...@gmail.com>
* URL : https://github.com/hickford/MechanicalSoup
* License : Expat
  Section : python

It builds those binary packages:

  python-mechanicalsoup - library for automating interaction with websites 
(Python 2)
  python3-mechanicalsoup - library for automating interaction with websites 
(Python 3)

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/python-mechanicalsoup

Alternatively, one can download the package with dget using this
command:

dget -x 
https://mentors.debian.net/debian/pool/main/p/python-mechanicalsoup/python-mechanicalsoup_0.10.0-1.dsc

Changes since the last upload:

  [ Ondřej Nový ]
  * d/control: Set Vcs-* to salsa.debian.org

  [ Ghislain Antony Vaillant ]
  * New upstream version 0.10.0 (Closes: #883366)
  * Refresh the patch queue
  * Update the copyright years
  * Drop the get-orig-source target
  * Normalize the package descriptions
  * Bump the debhelper version to 11
  * Bump the standards version to 4.1.3
  * Explicitly disable testing at build time.
Reason: Tests require network access
  * Add pytest-mock to the autopkgtest Depends

 Regards,
 Ghislain Vaillant



RFS: python-schema/0.6.7-1

2018-02-22 Thread Ghislain Vaillant
Package: sponsorship-requests
Severity: normal

Dear mentors,

I am looking for a sponsor for my package "python-schema"

* Package name: python-schema
  Version : 0.6.7-1
  Upstream Author : Vladimir Keleshev <vladi...@keleshev.com>
* URL : https://github.com/keleshev/schema
* License : Expat
  Section : python

It builds those binary packages:

  pypy-schema - simple data validation library (PyPy)
  python-schema - simple data validation library (Python 2)
  python3-schema - simple data validation library (Python 3)

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/python-schema

Alternatively, one can download the package with dget using this
command:

  dget -x 
https://mentors.debian.net/debian/pool/main/p/python-schema/python-schema_0.6.7-1.dsc

Changes since the last upload:

  [ Ondřej Nový ]
  * d/control: Set Vcs-* to salsa.debian.org

  [ Ghislain Antony Vaillant ]
  * Update the gbp configuration
  * New upstream version 0.6.7
  * Fixup whitespacing in rules file
  * Support the nocheck build profile
  * Filter egg-info with extend-diff-ignore
  * Bump the debhelper version to 11
  * Bump the standards version to 4.1.3
  * Update the copyright years

Regards,
Ghislain Vaillant



RFS: python-jsonrpc/1.10.8-1 [ITP]

2018-02-14 Thread Ghislain Vaillant
Package: sponsorship-requests
Severity: wishlist

Dear mentors,

I am looking for a sponsor for my package "python-jsonrpc"

* Package name: python-jsonrpc
  Version : 1.10.8-1
  Upstream Author : Kirill Pavlov <k...@p99.io>
* URL : https://github.com/pavlov99/json-rpc
* License : Expat
  Section : python

It builds those binary packages:

  python-jsonrpc-doc - documentation for json-rpc
  python3-jsonrpc - Python implementation of JSON-RPC 1.0 and 2.0 (Python 3)

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/python-jsonrpc

Alternatively, one can download the package with dget using this
command:

  dget -x 
https://mentors.debian.net/debian/pool/main/p/python-jsonrpc/python-jsonrpc_1.10.8-1.dsc

Packaging repository:

  https://salsa.debian.org/python-team/modules/python-jsonrpc

Debomatic build:

  
http://debomatic-amd64.debian.net/distribution#unstable/python-jsonrpc/1.10.8-1

Changes since the last upload:

  * Initial release. (Closes: #879050)

Regards,
Ghislain Vaillant



RFS: sphinxcontrib-doxylink/1.5-1

2018-02-13 Thread Ghislain Vaillant
Package: sponsorship-requests
Severity: normal

Dear mentors,

I am looking for a sponsor for my package "sphinxcontrib-doxylink"

* Package name: sphinxcontrib-doxylink
  Version : 1.5-1
  Upstream Author : Matt Williams
* URL : https://github.com/sphinx-contrib/doxylink
* License : BSD
  Section : python

It builds those binary packages:

  python3-sphinxcontrib.doxylink - Sphinx extension for linking to Doxygen 
documentation (Python 3)

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/sphinxcontrib-doxylink

Alternatively, one can download the package with dget using this
command:

  dget -x 
https://mentors.debian.net/debian/pool/main/s/sphinxcontrib-doxylink/sphinxcontrib-doxylink_1.5-1.dsc

Packaging repository:

  https://salsa.debian.org/python-team/modules/sphinxcontrib-doxylink

Debomatic build:

  
http://debomatic-amd64.debian.net/distribution#unstable/sphinxcontrib-doxylink/1.5-1

Changes since the last upload:

  [ Ondřej Nový ]
  * d/control: Set Vcs-* to salsa.debian.org

  [ Ghislain Antony Vaillant ]
  * d/gbp.conf: Sign all tags
  * d/gbp.conf: Drop pq section
  * Source future releases from GitHub
  * New upstream version 1.5
  * Drop the patch queue, fixed upstream
  * Drop the packaging for Python 2, dropped upstream
  * Add new build dependency on doxygen and pytest
  * Add pybuild setup for running the tests
  * Run autopkgtests for all supported Python versions
  * Point the Homepage URI to the GitHub repository
  * Update the copyright years
  * Bump the debhelper version to 11
  * Bump the standards version to 4.1.3
  * Normalize the package descriptions

Regards,
Ghislain Vaillant



Re: Salsa Vcs-* mass-commits

2018-02-12 Thread Ghislain Vaillant
Will it bypass packages for which such change has already been
committed, such as in src:flake8-polyfill (currently under RFS)? Just
checking.

Cheers,
Ghis

2018-02-12 13:41 GMT+00:00 Ondrej Novy :
> Hi,
>
> I would like to mass-commit to all DPMT's projects this:
>
> https://salsa.debian.org/python-team/modules/python-m3u8/commit/f2683222bb936c4f81047285fad2bb7a32e9087f
>
> Any thoughts?
>
> --
> Best regards
>  Ondřej Nový
>
> Email: n...@ondrej.org
> PGP: 3D98 3C52 EB85 980C 46A5  6090 3573 1255 9D1E 064B
>



Re: Help needed for Python3 package python3-bd2k which does not install due to syntax error

2018-02-12 Thread Ghislain Vaillant
On Monday 12 February 2018 at 10:10 +0100, Piotr Ożarowski wrote:
> > python-bd2k[1] which builds fine
> 
> it doesn't if you remove the override to not run Python 3.X tests

Indeed, the Python build system will almost always let you "build" a
package, even if it is not Python 3 ready.

Your only recourse here is to run a proper migration to Python 3 (which
could be just a matter of running 2to3) and use the test suite (which
hopefully covers enough of the code base) to validate it.

Ghis



Re: Move to salsa? Merge modules and apps team?

2018-02-07 Thread Ghislain Vaillant
2018-02-07 8:58 GMT+00:00 Matthias Klose :
> On 07.02.2018 08:37, W. Martin Borgert wrote:
>> Hi,
>>
>> how about moving the Python team(s) to salsa?
>> And how about merging the modules and apps teams into one?
>>
>> Moving git packages (modules team) is very easy using
>> import.sh from https://salsa.debian.org/mehdi/salsa-scripts.git
>>
>> Moving svn packages (apps team) is probably more work.
>>
>> Any opinions? Any doubts?
>
> I don't think that is a good idea.  Both teams are not very active when it 
> comes
> to address RC issues and updating to new upstream versions.

How does that affect moving our hosting to salsa? Or is your point
against merging both teams? I am confused.

>  From my point of
> view the apps team is worse than the modules team in this regard.  Currently
> both teams really don't care about packages which are uploaded by other team
> members.  Maybe it's time to clean up the teams before any considerations to
> merge them?  For single maintainers we do have process for MIA, and for QA
> uploads, however both python teams are escaping such efforts.

Still unclear what all of this has to do with the salsa migration. I
understand your concerns, but imo they are orthogonal to the
transition.

Cheers,
Ghis



Re: Move to salsa? Merge modules and apps team?

2018-02-07 Thread Ghislain Vaillant
On 7 Feb 2018 at 07:38, "W. Martin Borgert" wrote:

Hi,

how about moving the Python team(s) to salsa?


I'd be in favour of that.

And how about merging the modules and apps teams into one?


Same here. A single Python Team (python-team in salsa) would make sense.


Moving git packages (modules team) is very easy using
import.sh from https://salsa.debian.org/mehdi/salsa-scripts.git

Moving svn packages (apps team) is probably more work.


Could the import be done in stages then? Imo, packages which are ready for
transition should not have to wait.


Any opinions? Any doubts?

TIA & Cheers


Re: python-markdown and mkdocs circular build-dep

2018-01-11 Thread Ghislain Vaillant
2018-01-11 10:35 GMT+00:00 Dmitry Shachnev :
> Hi Brian and list,
>
> The new release of python-markdown has switched docs building from its own
> custom build system to mkdocs. However python-mkdocs itself build-depends on
> python3-markdown for tests, which results in a circular build-dependency.

Then in src:python-markdown ---> mkdocs <!nodoc>
And in src:mkdocs ---> python3-markdown <!nocheck>

This way, both ends of the circular dep are covered.
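As control fragments, the annotations might look like this
(build-dependency lists abbreviated; dpkg accepts `#` comments in
debian/control):

```
# In src:python-markdown, mkdocs is only needed to build the docs:
Build-Depends-Indep: mkdocs <!nodoc>

# In src:mkdocs, python3-markdown is only needed by the test suite:
Build-Depends: python3-markdown <!nocheck>
```

Bootstrapping then builds python-markdown with `-Pnodoc` first, mkdocs
with `-Pnocheck` next, and finally rebuilds both normally.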



Re: RFP to ITP: python3-ratelimiter

2017-12-10 Thread Ghislain Vaillant
The source package name should be python-ratelimiter, not python3-ratelimiter.

On 10 Dec 2017 at 21:10, "chrysn" wrote:

I seem to have picked the wrong address for the list, re-sending it to
hopefully the right one:

(Original mail To: 880...@bugs.debian.org)

> retitle 880661 ITP: python3-ratelimiter -- simple Python library for
limiting the rate of operations
> thanks
>
> The snakemake workaround broke, and we'll need that anyway; starting to
> package this in the style of DPMT policy with the intention of
> maintaining it within the team.

Best regards
chrysn

--
To use raw power is to make yourself infinitely vulnerable to greater
powers.
  -- Bene Gesserit axiom


RFS: parsedatetime/2.4-3

2017-11-13 Thread Ghislain Vaillant

Dear DPMT,

Could someone sponsor the following upload for parsedatetime [1]. It 
fixes a bug in the Recommends metadata. The package update has been 
successfully validated on debomatic [2].


[1] 
https://anonscm.debian.org/cgit/python-modules/packages/parsedatetime.git
[2] 
http://debomatic-amd64.debian.net/distribution#unstable/parsedatetime/2.4-3/


Thanks,
Ghis



Re: python-csb: Please help: Clean target throws UnicodeDecodeError

2017-11-13 Thread Ghislain Vaillant



On 13/11/17 07:41, Andreas Tille wrote:

Hi,

I'm trying to upgrade python-csb to the new upstream version (in
Git[1]).  If I try to build it I get:


gbp:info: Building with (cowbuilder) for sid
gbp:info: Tarballs 'python-csb_1.2.5+dfsg.orig.tar.gz' not found at 
'../tarballs/'
gbp:info: Creating 
/home/andreas/debian-maintain/alioth/debian-med_git/build-area/python-csb_1.2.5+dfsg.orig.tar.gz
gbp:info: Exporting 'WC' to 
'/home/andreas/debian-maintain/alioth/debian-med_git/build-area/python-csb-tmp'
gbp:info: Moving 
'/home/andreas/debian-maintain/alioth/debian-med_git/build-area/python-csb-tmp' 
to 
'/home/andreas/debian-maintain/alioth/debian-med_git/build-area/python-csb-1.2.5+dfsg'
I: using cowbuilder as pbuilder
dpkg-source: info: applying exclude_online_tests.patch
dpkg-source: info: applying reproducible.patch
dh clean --with python2,python3 --buildsystem=pybuild
debian/rules override_dh_auto_clean
make[1]: Entering directory 
'/home/andreas/debian-maintain/alioth/debian-med_git/build-area/python-csb-1.2.5+dfsg'
dh_auto_clean
I: pybuild base:184: python2.7 setup.py clean
running clean
removing 
'/home/andreas/debian-maintain/alioth/debian-med_git/build-area/python-csb-1.2.5+dfsg/.pybuild/pythonX.Y_2.7/build'
 (and everything under it)
'build/bdist.linux-x86_64' does not exist -- can't clean it
'build/scripts-2.7' does not exist -- can't clean it
I: pybuild base:184: python3.6 setup.py clean
Traceback (most recent call last):
   File "setup.py", line 8, in <module>
 __doc__ = open('README.rst').read()
   File "/usr/lib/python3.6/encodings/ascii.py", line 26, in decode
 return codecs.ascii_decode(input, self.errors)[0]
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 4205: 
ordinal not in range(128)
E: pybuild pybuild:283: clean: plugin distutils failed with: exit code=1: 
python3.6 setup.py clean
dh_auto_clean: pybuild --clean -i python{version} -p 3.6 returned exit code 13
debian/rules:21: recipe for target 'override_dh_auto_clean' failed


Any idea how to fix this?


In the preamble of `setup.py` add:

```
from io import open
```

Then replace subsequent calls to `open` in the file with:

```
open(..., encoding="utf-8")
```

Worth forwarding the patch upstream.
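As a quick sanity check, this sketch reproduces the failure mode with a
file containing a non-ASCII UTF-8 sequence and shows that an explicit
encoding makes the decode succeed (file name and contents hypothetical):

```python
import os
import tempfile
from io import open  # the built-in on Python 3; backports encoding= to Python 2

# A README-like file containing a UTF-8 no-break space (bytes 0xc2 0xa0),
# the kind of byte that trips the default ASCII codec when the build
# environment uses a C locale.
fd, path = tempfile.mkstemp(suffix=".rst")
os.write(fd, b"A\xc2\xa0B\n")
os.close(fd)

# With an explicit encoding, the read no longer depends on the locale.
with open(path, encoding="utf-8") as f:
    text = f.read()
os.remove(path)
```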

Ghis



Re: providing sphinx3-* binaries

2017-10-03 Thread Ghislain Vaillant



On 03/10/17 22:46, Thomas Goirand wrote:

On 09/29/2017 01:08 PM, PICCA Frederic-Emmanuel wrote:

Hello guyes.


override_dh_sphinxdoc:
ifeq (,$(findstring nodocs, $(DEB_BUILD_OPTIONS)))

 
nodocs or nodoc

I also do something like this when there are extensions.

override_dh_sphinxdoc:
ifeq (,$(findstring nodoc, $(DEB_BUILD_OPTIONS)))
PYBUILD_SYSTEM=custom \
PYBUILD_BUILD_ARGS="cd docs && PYTHONPATH={build_dir} 
http_proxy='127.0.0.1:9' {interpreter} -m sphinx -N -bhtml source build/html" dh_auto_build  
# HTML generator
dh_installdocs "docs/build/html" -p python-gpyfft-doc
dh_sphinxdoc -O--buildsystem=pybuild
endif


In fact, I was thinking that probably, it'd be nicer to even do:

ifeq (,$(findstring nodoc, $(DEB_BUILD_PROFILES)))

and have Build-Profiles: <!nodoc> for the python-foo-doc package. This
could even become a standard in the DPMT if everyone agrees.

Thoughts anyone?


s/DEB_BUILD_PROFILES/DEB_BUILD_OPTIONS

Quoting the relevant part of the documentation for the nodoc build 
profile [1]:


"Builds that set this profile must also add nodoc to DEB_BUILD_OPTIONS"

[1] https://wiki.debian.org/BuildProfileSpec
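Concretely, a hedged sketch combining both (binary package name
hypothetical): mark the doc package with `Build-Profiles: <!nodoc>` in
d/control, and gate the recipe on the build *option* in d/rules:

```make
# debian/rules -- guard the doc recipe on the nodoc build option
override_dh_sphinxdoc:
ifeq (,$(findstring nodoc,$(DEB_BUILD_OPTIONS)))
	dh_sphinxdoc -O--buildsystem=pybuild
endif
```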

Cheers,
Ghis



Re: pycharm package in debian

2017-10-01 Thread Ghislain Vaillant

On 01/10/17 20:33, Thomas Goirand wrote:

On 10/01/2017 09:47 AM, Ghislain Vaillant wrote:

Besides, from an end-user perspective, I can't picture anyone preferring
the (potentially lagging) packaged version over more official means like
the Jetbrains app or the snap package, both of which have been good at
keeping up with updates.


I definitively prefer a Debian package in main, even if it is "lagging
behind" as you said. For such a thing as an IDE, I expect it to be
mature enough so that the older version is enough for the everyday use.
And I would feel safer than using any random snap package. Who knows
what security issue is in there and what security policy and procedure
(if any) is in place.


You guessed it: there are none. Confinement is disabled for the pycharm 
snap, so it behaves like any other application installed via a deb package.



Don't get me wrong, I understand the rationales from a DFSG perspective.
I am just questioning whether users of this particular piece of software
would particularly care.


I at least would care. And would very much welcome anyone doing the work
of packaging and maintaining this kind of software.


Don't get me wrong, I would welcome such an effort too. I just wanted to 
emphasize the implications and level of commitment that such an effort 
would require. We have been there with eclipse and atom: the former is 
(still) lagging quite badly, and the latter was stopped at the initial 
packaging stage.


Ghis



Re: pycharm package in debian

2017-10-01 Thread Ghislain Vaillant
On 1 Oct 2017 at 15:53, "W. Martin Borgert" <deba...@debian.org> wrote:

On 2017-10-01 08:26, Ghislain Vaillant wrote:
> May I ask what would be the benefit for pycharm to be in Debian, when we
> already have the official Jetbrains Toolbox App or the snap package as
means
> to install and update the application?

I usually start to use software, when it arrives in Debian.
Or I package it. If there is some snap or other third party
package, I'm unsure how to work with it:


You mean the average user cannot use Google? The installation instructions
on Jetbrains' website sound pretty clear to me, and so does installing a
snap package.


How to install? How to uninstall? How to report bugs and to
whom? How to download the source code and rebuild it? Is it
DFSG-free anyway? (Does it already build reproducible?)


You and I care about these things as Debian contributors. The average Joe,
however, usually does not.


There is nothing wrong with having snap or other packages
available, but I'm not their target audience. But I'm an Emacs
- and vi! - user anyway :~)


Whatever works for you. Actually, vim can be turned into a fine Python
editor.


Another question is, how much work it will be and whether it is
worth the effort, esp. permanent maintenance. But if somebody
wants to do it, why not?


Most likely a lot. We are talking about a large application with probably
quite a few dependencies in Java / Kotlin.

Why not? Because failure to commit to regular updates would feed the
current narrative that Debian ships old and loosely maintained software.
Especially when there are other means of installing the software which are
officially documented upstream.

I have been there with packages I personally maintain (spyder for
instance), and I am raising these concerns out of my own experience and
feedback from existing users. Feel free to disregard.

Ghis


Re: pycharm package in debian

2017-10-01 Thread Ghislain Vaillant
On 1 Oct 2017 at 09:49, "Ben Finney" <bign...@debian.org> wrote:

Ghislain Vaillant <ghisv...@gmail.com> writes:

> Don't get me wrong, I understand the rationales from a DFSG
> perspective. I am just questioning whether users of this particular
> piece of software would particularly care.

The same could be asked of many user-facing packages in Debian. Your
question, though, makes an incorrect assumption: that “users of this
particular piece of software” is a group whose membership is unaffected
by having the package in Debian.

On the contrary. Take me as a counter-example. I am not a user of this
particular piece of software, because I have little interest in judging
for myself the hundreds of user-facing applications on my system.

If it were in Debian I can then take all the assurance that brings about
freedom and maintenance, and I may indeed consider using this particular
piece of software where otherwise I would not.

So, one important reason to package a work in Debian is to *increase*
the set of people who can easily install and use it.


All three means of installation (Jetbrains' app, snap, and potentially apt)
are one-liners for the end user. So ease of use is hardly a compelling
argument.

And I don't question your initial assessment about other applications in
the archive. Hence myself mentioning eclipse earlier, as a similar package
which used to be actively maintained until the effort died out.

I am just wondering whether an effort to package pycharm would not reach
the same outcome, assuming it passes the initial import phase. The Atom IDE
for instance never did.

Ghis


Re: pycharm package in debian

2017-10-01 Thread Ghislain Vaillant

On 01/10/17 08:38, Paul Wise wrote:

On Sun, Oct 1, 2017 at 3:26 PM, Ghislain Vaillant wrote:


May I ask what would be the benefit for pycharm to be in Debian, when we
already have the official Jetbrains Toolbox App or the snap package as means
to install and update the application?


I've never heard of the first of those, definitely wouldn't use the
snap package and probably not the Jetbrains thing, unless either of
them were built entirely from packages in Debian main, which I am
assuming they aren't ever going to be.


Sure, though I foresee getting this sort of app into the archive to be 
quite a challenge, both in terms of the initial packaging and the 
ongoing maintenance effort. Look at what happened with eclipse for 
instance.


Besides, from an end-user perspective, I can't picture anyone preferring 
the (potentially lagging) packaged version over more official means like 
the Jetbrains app or the snap package, both of which have been good at 
keeping up with updates.


Don't get me wrong, I understand the rationales from a DFSG perspective. 
I am just questioning whether users of this particular piece of software 
would particularly care.


Ghis



Re: pycharm package in debian

2017-10-01 Thread Ghislain Vaillant

On 01/10/17 02:36, Paul Wise wrote:

On Sat, Sep 30, 2017 at 10:35 PM, Julien Puydt wrote:

Le 30/09/2017 à 14:22, kamaraju kusumanchi a écrit :

Are there any plans to make a debian package of pycharm that is part
of official debian? I used their community edition on windows 7 and it
is awesome.


Maybe you should look at WNPP to see if someone filed a RFP or ITP, and
if not, submit a RFP yourself?


Looks like someone attempted it but gave up, so if you would like to
do it that would be great.

https://bugs.debian.org/742394
https://mentors.debian.net/intro-maintainers


May I ask what would be the benefit for pycharm to be in Debian, when we 
already have the official Jetbrains Toolbox App or the snap package as 
means to install and update the application?


Ghis



Re: Python 3 Statsmodels & Pandas

2017-09-23 Thread Ghislain Vaillant

On 22/09/17 20:32, Diane Trout wrote:

On Fri, 2017-09-22 at 10:57 +0200, Piotr Ożarowski wrote:

[Diane Trout, 2017-09-21]

I made larger changes to statsmodels, by using pybuild instead of
the
previous multiple targets in debian/rules.


you can simplify it even further by using pybuild's --ext-dest-dir:
(I didn't test as this branch FTBFS for me)


Ooh. I didn't know about PYBUILD_EXT_DEST_DIR_python3

that's useful.

Where should that option be documented?


In pybuild's manpage for a start?
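For reference, a hedged debian/rules sketch using the variable named
above (package name hypothetical):

```make
# debian/rules -- install built extension modules for Python 3 directly
# into the arch-dependent binary package
export PYBUILD_EXT_DEST_DIR_python3=debian/python3-foo

%:
	dh $@ --with python3 --buildsystem=pybuild
```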



Re: Python 3 Statsmodels & Pandas

2017-09-21 Thread Ghislain Vaillant

On 21/09/17 15:22, Andreas Tille wrote:

We somehow need to get some working spatstats to continue with other
packages.


I second Andreas' opinion. In principle, we'd want all tests to run at 
build and integration times and we might achieve that at some point 
during the Buster cycle.


Right now, this situation with pandas and statsmodels is slowing work on 
other scientific packages down significantly.


Cheers,
Ghis



Re: pydist and install_requires

2017-08-17 Thread Ghislain Vaillant

On 17/08/17 10:43, PICCA Frederic-Emmanuel wrote:

Hello Andrey


Isn't just adding the package names to Depends easier?


I just want this to be automatically generated and "upstreamable".


So do you have python-opengl, python-pyqt5 etc in Build-Depends?


yes, I think that I found the problem for opengl.
the egg info gives the name of the project which is PyOpenGL. (I am testing 
this now)


Indeed, you need to use the name registered on pypi, which can be 
different from the Debian package name. For OpenGL, the project is 
registered as PyOpenGL, for PyQt5 the name is PyQt5.



but for pyqt5 I do not have egg informations. Maybe the solution would be to 
add egg info to pyqt5.
But what about the split?


There are no individual registered names for the PyQt5 components, so 
you can only provide PyQt5 in the setup.py metadata, and you must list 
the necessary components yourself in d/control.


Ghis



Re: a few quick questions on gbp pq workflow

2017-08-06 Thread Ghislain Vaillant

On 06/08/17 19:56, Scott Kitterman wrote:

Generally when I find shortcomings in the tarball, I file bugs upstream.  In 
general, I've found upstream developers to be accepting of such changes.


Same here.


There's no need to DFSG the tarball if you can rebuild the docs.  The best way 
to ensure that is to rebuild them during the build process.


Also, the HTML docs are sometimes included by mistake because the docs 
folder was grafted and the Sphinx build directory not pruned in the 
corresponding MANIFEST.in.




Re: a few quick questions on gbp pq workflow

2017-08-06 Thread Ghislain Vaillant

On 06/08/17 19:53, Jeremy Stanley wrote:

Why would you need to repack a tarball just because it contains
prebuilt docs (non-DFSG-free licensed documentation aside)? I'm all
for rebuilding those at deb build time just to be sure you have the
right deps packaged too, but if the ones in the tarball are built
from DFSG-compliant upstream source, included in the archive for
that matter, then leaving the tarball pristine shouldn't be a policy
violation, right? That's like repacking a tarball for an
autotools-using project because upstream is shipping a configure
script built from an included configure.in file.


You'd still have to clean the pre-built files, since they would be 
overwritten by the build system and therefore dpkg-buildpackage would 
complain if you run the build twice.


So, you might as well just exclude them from the source straight away, no?
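
For instance, uscan can repack the upstream tarball and drop the pre-built
files via a Files-Excluded field in debian/copyright; a sketch, with an
illustrative path:

```
Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
Files-Excluded: docs/_build/*
```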

Ghis



Re: dpkg-buildpackage -B fails

2017-08-04 Thread Ghislain Vaillant

On 04/08/17 10:39, Gudjon I. Gudjonsson wrote:

Hi list

I wanted to fix my terrible track record lately, fix bugs and update my packages
but I ran into problems with sbuild on my package eric.

dpkg-buildpackage -B fails with the following error message:

  dpkg-genbuildinfo --build=any
dpkg-genbuildinfo: error: binary build with no binary artifacts found;
.buildinfo is meaningless


Because all your packages listed in d/control are Architecture: all? In 
this case the error message is accurate: dpkg-buildpackage -B yields no 
binary packages.


Ghis



Re: building manpages via setup.py

2017-08-02 Thread Ghislain Vaillant

On 02/08/17 10:45, PICCA Frederic-Emmanuel wrote:

First, that's very speculative. Second, that's upstream's problem.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))

You need to customize the sys.path in order to find the extensions.


The snippet you quoted is not specific to extension modules but to the 
use of the autodoc feature, which requires the modules to be in the 
PYTHONPATH. The `sys.path.insert` hack is just here so that you don't 
have to specify PYTHONPATH yourself when running the upstream Makefile.
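
A minimal sketch of that conf.py preamble (the '..' assumes conf.py lives in
a doc/ subdirectory of the source tree, which is an illustrative layout, not
a fixed convention):

```python
# Sketch of a Sphinx conf.py preamble making the package importable for
# autodoc without exporting PYTHONPATH manually.
import os
import sys

sys.path.insert(0, os.path.abspath('..'))
```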



This path can change depending on the setup.py build options so this is not 
reliable.


I don't understand how setup.py / build options are affecting the Sphinx 
documentation. You are supposed to either call the generator manually 
via sphinx-build (which the style guide recommends), or using the 
upstream Makefile (which upstream often does).


As far as reliability is concerned, my reference here is upstream. If 
upstream can produce the docs, then I should be able to do it too. And 
they don't have pybuild for that.



Why? All you need is *one* occurrence of the extension modules somewhere
in the PYTHONPATH in order to generate the docs. Chances are that's how
upstream generates them.


Because, from experience, I find issues in the build system and the Python code 
when building the doc for multiple
interpreters (python2/python3 differences...)


Why would you build the docs for each supported Python version, 
considering you'll end up shipping only one instance of the generated 
HTML? I am probably missing something here.


You might as well generate the docs just once with the default Python 2 
or Python 3 interpreter, the same way you would do without extension 
modules, no?


Pardon my insistence, but I really fail to understand what issues you 
are referring to.



Found a total of 643 reverse build-depend(s) for python-all-dev.

not that small



How is the ratio over all the Python packages? I suspect very small.


Found a total of 1968 reverse build-depend(s) for python-all.

not that small 32 % ;)


More than I expected, indeed.

And amongst these 643 packages, how many have such large build times 
that the overhead of an additional inplace call would be considered 
prohibitive, I wonder?


Besides, the docs will typically be processed by arch-all builders 
(provided you use -indep targets), so arch-any builds won't even feel 
that overhead, right?


I could be wrong though.



Re: building manpages via setup.py

2017-08-02 Thread Ghislain Vaillant



On 02/08/17 09:55, PICCA Frederic-Emmanuel wrote:

PYTHONPATH=. sphinx-build -N -b html  



One can also use the sphinx-generated Makefile if available:



PYTHONPATH=$(CURDIR) $(MAKE) -C  html



Both are simple one-liners and do not rely on pybuild.


Yes, it works, but this is fragile since the organisation of the module can 
change in the sources.


First, that's very speculative. Second, that's upstream's problem.

I have not seen many upstream projects playing with the layout of their 
modules from one version to the next. If that's the case, then there are 
worst things to be worried about (API breakage for instance) than the docs.



At least the .pybuild directory is under the responsibility of pybuild, and we 
should use pybuild instead of relying
on maintainer snippets (prone to typos and drift over time).


The upstream Makefile and conf.py are likely generated by Sphinx itself 
via sphinx-quickstart. Did your upstream tinker with them that much that 
they cannot be trusted?



It would be nice if doc generation in Python were standardized.


Some upstream do use a build_sphinx command, but it is far from common 
and it does not solve the extension module problem.



If it does not cost much to build the extension inplace, then the
simplest option is to prefix one of these calls above with:



python3 setup.py build_ext --inplace


When you have multiple versions of the interpreter, you prefer to avoid --inplace.


Why? All you need is *one* occurrence of the extension modules somewhere 
in the PYTHONPATH in order to generate the docs. Chances are that's how 
upstream generates them.


I fail to picture how this is an issue in practice considering your 
build override will run the isolated pybuilds first and the sphinx call 
last.



If the cost is prohibitive, which arguably applies to a very limited set
of packages (yours included) then you would use pybuild for that, as
Piotr kindly suggested.


Yes, it also depends on the arch. Some are really slow.


That's unfortunate, indeed.


Considering the rarity of this use case though, I wonder whether it is
worth adding a separate section to the style guide.


Found a total of 643 reverse build-depend(s) for python-all-dev.

not that small


How is the ratio over all the Python packages? I suspect very small.



Re: building manpages via setup.py

2017-08-02 Thread Ghislain Vaillant

On 02/08/17 09:19, PICCA Frederic-Emmanuel wrote:

At the end of the day, it is just a matter of providing an appropriate
PYTHONPATH, regardless of whether pybuild is used or not.


Yes, but we want to avoid multiplying the ways of providing this PYTHONPATH.


For the vast majority of packages, the current method listed in 
LibraryStyleGuide applies, i.e.:


PYTHONPATH=. sphinx-build -N -b html  

One can also use the sphinx-generated Makefile if available:

PYTHONPATH=$(CURDIR) $(MAKE) -C  html

Both are simple one-liners and do not rely on pybuild.


Is it possible to have a recommended way which works for both modules and 
extensions?


If it does not cost much to build the extension inplace, then the 
simplest option is to prefix one of these calls above with:


python3 setup.py build_ext --inplace

If the cost is prohibitive, which arguably applies to a very limited set 
of packages (yours included) then you would use pybuild for that, as 
Piotr kindly suggested.



once agreed, we should put this in the wiki


Considering the rarity of this use case though, I wonder whether it is 
worth adding a separate section to the style guide.


Ghis



Re: building manpages via setup.py

2017-08-02 Thread Ghislain Vaillant

On 02/08/17 09:03, PICCA Frederic-Emmanuel wrote:

Perhaps the LibraryStyleGuide should be updated to reflect on this
change? I believe we are still advising explicit http_proxy /
https_proxy exports prior to running sphinx-build.


And running sphinx-build does not work, especially if there are extensions in the 
documentation.
sphinx-build should be run via pybuild in order to know about the build_dir.

right ?


At the end of the day, it is just a matter of providing an appropriate 
PYTHONPATH, regardless of whether pybuild is used or not.


The default is $(CURDIR) for the vast majority of packages. But, it may 
also be something else if extension packages are involved, or if the 
package directory is under a different folder than the root directory, 
such as src/.


Ghis



Re: building manpages via setup.py

2017-08-02 Thread Ghislain Vaillant



On 02/08/17 08:44, Piotr Ożarowski wrote:

[PICCA Frederic-Emmanuel, 2017-08-02]

you can drop it, PYTHONPATH and http_proxy should be set by pybuild


Is it true for jessie?

I need to support jessie and stretch

And even debian7...


I didn't test it even for unstable, but IIRC pybuild exports those in
all steps since a long time ago. You'll know after first builds...


Perhaps the LibraryStyleGuide should be updated to reflect on this 
change? I believe we are still advising explicit http_proxy / 
https_proxy exports prior to running sphinx-build.


Ghis



Re: building manpages via setup.py

2017-08-01 Thread Ghislain Vaillant

On 01/08/17 15:15, PICCA Frederic-Emmanuel wrote:

Hello,

I am working on the pyfai package.
This package contains one module with extensions (the important point)

The new upstream version 0.14.0 provides a build_man target via setup.py

So in order to generate the doc I need to do

python setup.py build_man

Now if I look at this target, I can find this code

-

class BuildMan(Command):
 """Command to build man pages"""
 user_options = []

 def initialize_options(self):
 pass

 def finalize_options(self):
 pass

 def entry_points_iterator(self):
 """Iterate other entry points available on the project."""
 entry_points = self.distribution.entry_points
 console_scripts = entry_points.get('console_scripts', [])
 gui_scripts = entry_points.get('gui_scripts', [])
 scripts = []
 scripts.extend(console_scripts)
 scripts.extend(gui_scripts)
 for script in scripts:
 elements = script.split("=")
 target_name = elements[0].strip()
 elements = elements[1].split(":")
 module_name = elements[0].strip()
 function_name = elements[1].strip()
 yield target_name, module_name, function_name

 def run(self):
 build = self.get_finalized_command('build')
 path = sys.path
 path.insert(0, os.path.abspath(build.build_lib))

 env = dict((str(k), str(v)) for k, v in os.environ.items())
 env["PYTHONPATH"] = os.pathsep.join(path)

 import subprocess

 status = subprocess.call(["mkdir", "-p", "build/man"])
 if status != 0:
 raise RuntimeError("Fail to create build/man directory")

 import tempfile
 import stat
 script_name = None

 entry_points = self.entry_points_iterator()
 for target_name, module_name, function_name in entry_points:
 logger.info("Build man for entry-point target '%s'" % target_name)
 # help2man expect a single executable file to extract the help
 # we create it, execute it, and delete it at the end

 py3 = sys.version_info >= (3, 0)
 try:
 # create a launcher using the right python interpreter
 script_fid, script_name = tempfile.mkstemp(prefix="%s_" % target_name, text=True)
 script = os.fdopen(script_fid, 'wt')
 script.write("#!%s\n" % sys.executable)
 script.write("import %s as app\n" % module_name)
 script.write("app.%s()\n" % function_name)
 script.close()
 # make it executable
 mode = os.stat(script_name).st_mode
 os.chmod(script_name, mode + stat.S_IEXEC)

 # execute help2man
 man_file = "build/man/%s.1" % target_name
 command_line = ["help2man", script_name, "-o", man_file]
 if not py3:
 # Before Python 3.4, ArgParser --version was using
 # stderr to print the version
 command_line.append("--no-discard-stderr")

 p = subprocess.Popen(command_line, env=env)
 status = p.wait()
 if status != 0:
 raise RuntimeError("Fail to generate '%s' man documentation" % target_name)
 finally:
 # clean up the script
 if script_name is not None:
 os.remove(script_name)
-


As you can see, this creates a launcher script for each entry point found in the 
setup.py and runs help2man on it.
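
For reference, the entry-point parsing used above can be sketched in isolation;
the spec string format is 'name = module:function', as found in setup.py, and
the entry-point name below is hypothetical:

```python
def parse_entry_point(spec):
    """Split an entry-point spec 'name = module:function' into its parts."""
    target, _, location = (part.strip() for part in spec.partition("="))
    module, _, function = (part.strip() for part in location.partition(":"))
    return target, module, function

print(parse_entry_point("pyfai-calib = pyFAI.app.calib:main"))
# -> ('pyfai-calib', 'pyFAI.app.calib', 'main')
```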

For now I would like to use this setup.py without modification.

So what should I do to run

python setup.py build_man with the options provided by pybuild during the 
normal build, in order to let the generated scripts find the pyFAI modules and 
their extensions?


Simplest I can think of would be to build the extensions inplace 
followed by the call to build_man. Something like:


override_dh_auto_build:
dh_auto_build
python3 setup.py build_ext --inplace
python3 setup.py build_man

I left the http_proxy exports and nodoc guards out for clarity.


Second question: what is the right way to generate the man pages for a Python 
application?


Usually via Sphinx, if the upstream documentation uses it. Regardless of 
the stack, `help2man` is often considered the poor man's choice for 
generating manpages.
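
For reference, Sphinx drives manpage generation through a man_pages list in
conf.py, rendered with `sphinx-build -b man`; a sketch with hypothetical names:

```python
# conf.py fragment (sketch): each tuple is
#   (source document, page name, description, authors, manual section),
# rendered by: sphinx-build -b man <sourcedir> <builddir>
man_pages = [
    ("man/pyfai-calib", "pyfai-calib", "pyFAI calibration tool",
     ["pyFAI developers"], 1),
]
```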


Let me know if you'd like me to have a look :-)

Ghis



News regarding the ITP for python-coloredlogs

2017-07-24 Thread Ghislain Vaillant
On Sun, 5 Feb 2017 23:51:57 +0530 Gaurav Juvekar  wrote:
> On Sun, 5 Feb 2017 18:24:27 +0100 Adam Borowski 
wrote:
> > However, the second part, converting logs to HTML, is not only
redundant
> > with ansi2html (package colorized-logs, split out from kbtin) and
aha, but
> > also too simplicistic and buggy to do its job.
> 
> > I think it'd be best if you dropped the "coloredlogs" binary and
left just
> > the python logging libraries; the alternative would involve
reimplementing
> > it basically from scratch.
> 
> Hi,
> 
> I agree with you. I have removed the binary package, and renamed the
source package to python-coloredlogs from coloredlogs.
> 
> I have re-uploaded the package at https://mentors.debian.net/package/
python-coloredlogs
> You can download it with dget as 
>dget -x https://mentors.debian.net/debian/pool/main/p/python-color
edlogs/python-coloredlogs_5.2-1.dsc

This issue is currently blocking my work on the spyder-terminal plugin,
for which coloredlogs is a dependency. Could you guys keep me updated
on the progress?

I'd be happy to take over if the motivation is gone.

Cheers,
Ghis



Re: spyder3 missing rope dependency

2017-06-04 Thread Ghislain Vaillant
On Sun, 2017-06-04 at 09:47 -0400, kamaraju kusumanchi wrote:
> On Sun, Jun 4, 2017 at 5:00 AM, Ghislain Vaillant <ghisv...@gmail.com> wrote:
> > A new version of Spyder was also uploaded to experimental (3.1.4),
> > which adds the Python 3 rope dependency. Despite the warning message,
> > the absence of rope is harmless. The Spyder IDE works perfectly fine
> > without.
> 
> Thanks. The IDE does seem to work fine without this functionality. Is
> there any way to disable this check? I briefly looked in Tools ->
> Preferences but could not find anything relevant.

Not that I am aware of. I suspect you'd have to hack in the source code
to silence this.

Ghis



Re: spyder3 missing rope dependency

2017-06-04 Thread Ghislain Vaillant
On Sun, 2017-06-04 at 00:40 -0400, kamaraju kusumanchi wrote:
> Launching spyder3 gives the following error
> 
> You have missing dependencies!
> rope >=0.9.4: None (NOK)
> Please install them to avoid this message.
> 
> I see a python-rope package but no analogous python3-rope. Any idea
> how to fix this?

The python3-rope package was made available in experimental, since a
Python 3 compatible version of Rope did not land in Debian in time
before the freeze.

A new version of Spyder was also uploaded to experimental (3.1.4),
which adds the Python 3 rope dependency. Despite the warning message,
the absence of rope is harmless. The Spyder IDE works perfectly fine
without.

Cheers,
Ghis



Re: [Python-modules-team] partd_0.3.8-1_source.changes ACCEPTED into unstable

2017-06-02 Thread Ghislain Vaillant
On Fri, 2017-06-02 at 11:35 -0400, Barry Warsaw wrote:
> Hi Diane,
> 
> On Jun 02, 2017, at 05:48 AM, Debian FTP Masters wrote:
> 
> > partd (0.3.8-1) unstable; urgency=medium
> > .
> >   * Switch from git-dpm to gbp
> 
> I may be misremembering our previous discussions on the topic, but we all know
> that in the long term we want to convert off of git-dpm.

Indeed.

> My recollection was that in the short term, "officially" we want to
> opportunistically convert packages as an experiment, and to work out the steps
> needed, so that at some time in the future, we'd mass migrate the bulk of our
> packages.

Correct. However, I have collaborated with Diane on some of her
packages (dask) and the git-dpm setup was broken. So, I took this
opportunity to get rid of git-dpm and switch to gbp with her approval.

> I'm curious, did you follow the steps outlined in
> https://wiki.debian.org/Python/GitPackagingPQ to do the conversion?  (Search
> for "Converting git-dpm to gbp pq") or did you follow some other process?  Are
> you adopting gbp-pq for your own workflow?  Do you have any insights that will
> help others convert, or that can guide our future mass migration?  Is there
> anything you can add to the wiki page to help others when they
> opportunistically convert?

I can't speak for herself for partd.

For dask, the patch queue was not recognized by git-dpm, so the source
tree was essentially in patch non-applied mode. The conversion was then
straightforward: delete .git-dpm, add gbp.conf and do a round trip of
gbp pq import / export.

> For the team: should we just allow opportunistic conversions and live with a
> mixed git-dpm/gbp state in our packages for a while?

IMO, I believe it is not the end of the world, but I am obviously
biased.

Ghis



Re: Joining the team

2017-05-20 Thread Ghislain Vaillant
On Sat, 2017-05-20 at 17:40 +0200, sab wrote:
> 
> On 07/05/2017 17:30, sab wrote:
> > 
> > On 06/05/2017 17:14, Piotr Ożarowski wrote: 
> > > [sab, 2017-05-06] 
> > > > 'https://anonscm.debian.org/git/python-modules/packages/python-zxcvbn.git/':
> > > >  
> > >  git+ssh://git.debian.org/git/python-modules/packages/python-zxcvbn.git 
> >  Thanks 
> > I have pushed on 
> > git+ssh://anonscm.debian.org/git/python-modules/packages/python-zxcvbn.git/
> > for all branches. 
> > Now who sponsors the changes? 
> > Regards, 
> > Sabino 
> > 
>  
> Hi!
> How do I make package accepted into experimental?
> https://anonscm.debian.org/cgit/python-modules/packages/python-zxcvbn.git/
> Regards,
> Sabino

You need to request sponsorship for uploading your package [1]. Then,
an interested sponsor should review your packaging and may ask for
further improvement to it prior to the upload.

[1] https://mentors.debian.net/sponsor/rfs-howto

Ghis



Re: python-parse-type

2017-05-16 Thread Ghislain Vaillant
On Tue, 2017-05-16 at 12:36 +0200, Piotr Ożarowski wrote:
> [Brian May, 2017-05-16]
> > python-enum - robust enumerated type support in Python
> 
> ah, that's why python-enum34 name was chosen, there's even
> "Breaks: python-enum" - I've missed it because it's no longer in
> unstable

Indeed both (enum and enum34) are different projects. The latter is
officially described as a backport of the enum module introduced since
Python 3.4 to previous versions. That's probably the one you should be
using.

Ghis



Re: [Python-modules-commits] [python-cpuinfo] 02/02: Import Debian changes 3.0.0-1

2017-04-16 Thread Ghislain Vaillant
On Sun, 2017-04-16 at 18:09 +0200, Mattia Rizzolo wrote:
> On Sun, Apr 16, 2017 at 05:50:54PM +0200, Hugo Lefeuvre wrote:
> > I introduced an additional binary package for this script because I thought
> > people cold have found it useful. But, right, everything considered I should
> > better drop it.
> 
> Wait a second before dropping this..
> 
> What would be the downside of having it in a separate package?  I
> concur that the "py-" prefix strikes as odd, but I otherwise generally
> recommend keeping /usr/bin/* stuff out of python-* packages, while
> keeping in the latter only the python module, for a bunch of reasons.

Let me quote that upstream README for you:

```
Run as a script

$ python cpuinfo/cpuinfo.py

Run as a module

$ python -m cpuinfo

Run as a library

import cpuinfo
info = cpuinfo.get_cpu_info()
print(info)
```

Nowhere is there mentioned a `py-cpuinfo` executable. The instructions
given upstream for command-line usage are via a separate wrapper script
named `cpuinfo.py` or a module call.

In fact, if you install `py-cpuinfo` in a venv:
```
python3 -m venv py-cpuinfo
source py-cpuinfo/bin/activate
pip install py-cpuinfo
ls py-cpuinfo/bin/
```

You will only find an entry-point called `cpuinfo`.

So what you guys are proposing is to introduce a new wrapper script, in
its own binary package, whose name is not endorsed by upstream, and
which will end-up completely Debian specific.

Am I really the only one in this team to think this proposal is a
complete nonsense?

> Surely I'm not the only one who would consider moving the file back to
> python3-cpuinfo a step backward…

I fail to understand how your anti-Python-3 feelings add anything
constructive to this thread. Moving on.

AFAIC, I happily use pytest or sphinx via their respective python[3]-
pytest and python[3]-sphinx. I don't consider the lack of a dedicated
pytest or sphinx binary package a step backward.

Regards,
Ghis



Re: [Python-modules-commits] [python-cpuinfo] 02/02: Import Debian changes 3.0.0-1

2017-04-16 Thread Ghislain Vaillant
Also, the `cpuinfo` utility can be invoked with `python[3] -m cpuinfo` 
according to the upstream README [1]. So, I am not convinced of the 
benefit of introducing an additional binary package (py-cpuinfo) for 
something the library packages already provide.


[1] https://github.com/workhorsy/py-cpuinfo#run-as-a-module

Ghis


On 16/04/17 16:24, Sandro Tosi wrote:

well, the py- prefix seems wrong as well (and not part of the recommendation)

On Sun, Apr 16, 2017 at 5:44 AM, Ondrej Novy  wrote:

Hi,


2017-04-14 20:25 GMT+02:00 Sandro Tosi :


why the cli tools are in a separate packages, instead of being inside
the py3k package (as it seems to suggest it uses the python3
module to work)?



because it's one of our team recommendation:
https://wiki.debian.org/Python/LibraryStyleGuide#Executables_and_library_packages

But the section of the py-cpuinfo binary package is wrong.

--
Best regards
 Ondřej Nový

Email: n...@ondrej.org
PGP: 3D98 3C52 EB85 980C 46A5  6090 3573 1255 9D1E 064B





Re: Fwd: next version of csvkit

2017-04-02 Thread Ghislain Vaillant

On 02/04/17 08:39, Vincent Bernat wrote:

 ❦  1 April 2017 19:42 -0400, Sandro Tosi:


It's not at all clear where [1] came from.  The lintian changelog [3] does not
give a bug reference and I couldn't find a bug.


it's just a few lines down in the changelog:
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=829744 (it is kinda
sad that there was no discussion with the python team from the lintian
maintainer before accepting and merging it, even if it was done after
stretch freeze, which was indeed a clever move)


I'll just point out that Scott did contribute to the discussion which 
lead to the introduction of this Lintian tag in the bug report mentioned 
above.



It's a general trend with Lintian: it's easier to push for a Lintian tag
in a random bug report than getting a consensus and translate it to a
Lintian tag.


The introduction of the Lintian tag was ack'd by a member of the team 
(see message 40). Sure this is no consensus, but the decision was not 
"random" either.


CC'ing lamby who might want to shed some light on this.


On the current subject, I also agree we should not drop prematurely
packages targeted to Python 2. It is likely the support will be extended
past 2020, at least by distributions with a 10-year support.



IMO, it's not our job to decide how the ecosystem should work. We will
be alienating our own users. We are not in the strong position we were
10 years ago and those users will just switch to another distribution.


Please focus on the current package (csvkit). It is an **application** 
package, so whether the console scripts are called with Python 2 or 
Python 3 really does not matter.


Perhaps it used to be the case in the past, but the library component 
has been moved to the agate packages, for which I answered Sandro's 
request to package. The reward I am getting is anger and frustration 
from the team, despite my good will. Not cool :-(


Nowadays, the binary package produced by src:csvkit might as well be 
called `csvkit`, and be installed somewhere under /usr/share instead of 
the system site-packages for what it is worth.


Ghis



Re: PyPI source or github source?

2017-03-13 Thread Ghislain Vaillant
On Tue, 2017-03-14 at 08:32 +1100, Brian May wrote:
> Scott Kitterman  writes:
> 
> > Like Barry, I've never had an issue with upstreams fixing their MANIFEST.in 
> > so 
> > that the sdist is complete when I point out the issue.
> 
> I have had some people who do argue that sdist is an installation
> format, not a source format - if you want the source use github.

Same here.

> I think most of them eventually do change, however it makes me a bit
> uneasy that maybe they might be right.

Until this thread was started, I had never questioned the
"canonicalness" of PyPI releases either. 

> As a result, some of my recent packages I have used github, rather
> then try to "fix" things on PyPI. That way I can be sure that the
> build will always be consistant, and not suddenly and unexpectedly
> drop files.

Do you get rid of the useless dotfiles (gitignore, ci settings, tox...)
or leave them alone then?

Ghis 



Re: PyPI source or github source?

2017-03-12 Thread Ghislain Vaillant
On Sun, 2017-03-12 at 10:53 +0800, Paul Wise wrote:
> On Sun, Mar 12, 2017 at 10:19 AM, Brian May wrote:
> 
> > Sure, you could argue that PyPI source packages should contain
> > everything the github package does. In fact there is a PyPI tool to help
> > get the MANIFEST.in correct for such purposes -
> > https://pypi.python.org/pypi/check-manifest
> 
> Anyone interested in packaging this?

There is an RFP filed for it [1]. I could have a look at it. I recently
found out that this tool is listed in the PyPA sample project [2].

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=734117
[2] https://github.com/pypa/sampleproject/blob/master/setup.py

> > Unfortunately, github releases cannot (AFAIK) easily be signed, unless
> > you retrieve signed git tags directly from git (which is not supported
> > by uscan AFAIK). Would be good if gbp buildpackage supported signing git
> > tags, I don't think it does either.
> 
> uscan does support git but doesn't check OpenPGP signatures on tags.
> It would probably be easy to add that, please file a bug about it.
> 
> > * Do we consider signed git tags / commits secure, considering they are
> >   based on SHA1?
> 
> Better than having unsigned tags/commits.
> 
> > * Is there any point having signed PyPI releases when (very likely) the
> >   underlying upstream git repository has no signatures?
> 
> Yes, presumably the PyPI releases are built from the author's copy of
> the git repository, rather than directly from the online repository,
> hopefully they have verified all commits they pulled into it.
> 
> > * Is there any point having signed PyPI releases when (very likely) the
> >   public key is stored in an insecure DPMT respository on
> >   git.debian.org?
> 
> Yes, it is also stored in immutable places like the archive and snapshot.d.o.

My personal gripe with GitHub releases is that they are often full of
unwanted stuff, such as various CI config files (travis, circle,
appveyor...), test config files (pytest, nose, tox...), conda scripts,
GitHub files (.github/ directory) and whatnot. They are not harmful but
it is clutter the distributed sources could definitely live without.

On the other hand, I have seen very few pieces of software which had a
*comprehensive* MANIFEST.in for generating a tarball suitable for
packaging. The file is often either absent, or missing inclusion of the
docs, tests, change log or license files. Upstream is usually receptive
in providing "better" source tarballs, but I have had some developers
taking an aggressive stance towards keeping the PyPI tarball as minimal
as possible in the past.

Ghis



Re: Moving a package from collab-maint to python-modules

2017-03-11 Thread Ghislain Vaillant
On Sat, 2017-03-11 at 18:14 +, Scott Kitterman wrote:
> 
> On March 11, 2017 6:52:59 AM EST, Ghislain Vaillant <ghisv...@gmail.com> 
> wrote:
> > On Sat, 2017-03-11 at 11:24 +, Christopher Hoskin wrote:
> > > Hello,
> > > 
> > > I'd like to package python-jsonpointer for Debian. The filer of the
> > 
> > RFP (Bug #754296) Pietro Battiston, has created a repository at
> > > 
> > > https://anonscm.debian.org/cgit/collab-maint/python-jsonpointer.git
> > > 
> > > but has no intention of becoming the maintainer, and the package has
> > 
> > not been uploaded. The existing repository does not use git-dpm or
> > pristine-tar.
> > > 
> > > I'd like to maintain this package within DPMT. Is there a way I can
> > 
> > migrate the existing repository, or should I just start again?
> > > 
> > > Thanks.
> > > 
> > > Christopher Hoskin
> > 
> > I know Pietro (I co-maintain src:bottleneck with him) and he has been
> > keen on moving his packages to team-maintenance under the DPMT.
> > However, since the package in question was never released, and is
> > essentially RFP's now, you can probably start from scratch.
> > 
> > I am wondering whether it makes sense to use git-dpm for new packages,
> > now that the team is planning to transition from using git-dpm to gbp.
> > I personally used gbp straight away for my recent packages (see pytest-
> > qt, pytest-xvfb) for examples.
> 
> I think we should be consistent like we were with svn/git.  Stick with one 
> thing and then switch everything over.  If we don't, we aren't acting as a 
> team.
> 
> Scott K

Fair enough.

Christopher, please disregard my previous comment.

Ghis



Re: Moving a package from collab-maint to python-modules

2017-03-11 Thread Ghislain Vaillant
On Sat, 2017-03-11 at 11:24 +, Christopher Hoskin wrote:
> Hello,
> 
> I'd like to package python-jsonpointer for Debian. The filer of the RFP (Bug 
> #754296) Pietro Battiston, has created a repository at
> 
> https://anonscm.debian.org/cgit/collab-maint/python-jsonpointer.git
> 
> but has no intention of becoming the maintainer, and the package has not been 
> uploaded. The existing repository does not use git-dpm or pristine-tar.
> 
> I'd like to maintain this package within DPMT. Is there a way I can migrate 
> the existing repository, or should I just start again?
> 
> Thanks.
> 
> Christopher Hoskin

I know Pietro (I co-maintain src:bottleneck with him) and he has been
keen on moving his packages to team-maintenance under the DPMT.
However, since the package in question was never released, and is
essentially an RFP now, you can probably start from scratch.

I am wondering whether it makes sense to use git-dpm for new packages,
now that the team is planning to transition from using git-dpm to gbp.
I personally used gbp straight away for my recent packages (see
pytest-qt and pytest-xvfb for examples).

Hope this helps,
Ghis



Re: I want to maintain my current packages within the team

2017-03-10 Thread Ghislain Vaillant

On 10/03/17 08:27, Thomas Güttler wrote:

Hi,

I am following this instruction:
http://python-modules.alioth.debian.org/policy.html

Yes, I accept the above policy.

Yes, collaborative maintenance is preferred.

Here is what I want to package: https://github.com/guettli/reprec

Up to now reprec contains these tools:

  * reprec: Replace strings in text files. Can work recursive in a
directory tree
  * setops: Set operations (union, intersection, ...) for line based
files.



Please consider motivating the packaging of this software, i.e. how it 
is potentially useful to Debian users, what is the novelty versus other 
file processing tools, etc...



According to the policy I should include my alioth login. Up to now I
have none.


You need to create your account on Alioth first [1].

[1] https://alioth.debian.org/


What is the next step now? If possible the username should be "guettli".


There is a high chance it will end-up being "guettli-guest". Please 
confirm once you have successfully created your account.


Cheers,
Ghis



Re: Moving off of git-dpm (Re: git-dpm breakage src:faker)

2017-03-09 Thread Ghislain Vaillant
On Thu, 2017-03-09 at 21:13 +1100, Brian May wrote:
> Brian May  writes:
> 
> > git read-tree --reset -u upstream
> > git reset -- debian
> > git checkout debian
> > git rm debian/.git-dpm
> 
> I have tried these steps on python-mkdocs in the debian/experimental
> branch, and then upgraded to the latest upstream (using instructions on
> wiki). Works perfectly[1].
> 
> The only unexpected problem I had is that "gbp import-orig --uscan", by
> default, switches to the master branch and attempts to merge the new
> upstream there. Which wasn't going to work, because master still is the
> patches-applied git-dpm version. I had assumed that it would work on the
> current branch; it doesn't.

You can override the target debian / upstream branches with `gbp
import-orig --debian-branch=debian/experimental --upstream-
branch=upstream/latest`.

Long-term you'd want to write your DEP-14 compliant configuration in
debian/gbp.conf indeed.

Ghis



Re: Under which umbrella should I package my Python IDE for beginners (Thonny)?

2017-03-09 Thread Ghislain Vaillant
On Thu, 2017-03-09 at 09:58 +0200, Aivar Annamaa wrote:
> Hi!
> 
> I've developed a Python IDE for beginners, Thonny (http://thonny.org) 
> and I intend to package it for Debian 
> (https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=857042).

Nice.

> The application would consist of a Python 3 package named "thonny" 
> (https://pypi.python.org/pypi/thonny), a simple launch script (python3 
> -m thonny), desktop file and icon. It depends only on Python 3.4 or 
> later with Tkinter.

You've got the dependency chain covered.

> For the needs of the main application, the Python package could be 
> private for the application, but as 3rd party Thonny plugins may need to 
> import something from there, it makes sense to treat it as a shared 
> Python package.

Ack.

> By the example of similar app Spyder 
> (https://packages.debian.org/sid/spyder3), I thought that it makes sense 
> to create two binary packages: "python3-thonny" for the Python package 
> and "thonny" for providing end user facilities. Would you agree with 
> this plan?

Sounds ok.

> If I want to publish Thonny under one of the Python teams, which one 
> should I choose? (My own preference would be Python Modules, because I'm 
> more comfortable with git than with svn).

You can use git with PAPT too.

> If it's good idea to publish it under Python Modules Team, can you 
> please add me to the team (my Alioth user is aivarannamaa-guest)? I've 
> read the policy document and agree with it.
> 
> Is anyone willing to sponsor the package?

You can start working on the packaging on a private repository whilst
waiting for being accepted to a team (PAPT, DPMT or others). You will
need a functional package before seeking sponsorship.

Cheers,
Ghis



RFS: pytest-qt/2.1.0-1

2017-02-11 Thread Ghislain Vaillant
Package: sponsorship-requests
Severity: wishlist

Dear mentors,

I am looking for a sponsor for my package "pytest-qt"

* Package name: pytest-qt
  Version : 2.1.0-1
  Upstream Author : Bruno Oliveira
* URL : https://github.com/pytest-dev/pytest-qt
* License : Expat
  Section : python

It builds those binary packages:

  python-pytestqt-doc - documentation for pytest-qt
  python3-pytestqt - pytest plugin for Qt application testing (Python 3)

To access further information about this package, please visit the
following URL:

  https://mentors.debian.net/package/pytest-qt

Alternatively, one can download the package with dget using this
command:

  dget -x 
https://mentors.debian.net/debian/pool/main/p/pytest-qt/pytest-qt_2.1.0-1.dsc

Or checkout the packaging repository at:

  https://anonscm.debian.org/git/python-modules/packages/pytest-qt.git

Changes since the last upload:

  * Initial release. (Closes: #854365)

Regards,
Ghislain Vaillant



Re: Re: Naming convention for -doc package

2017-02-10 Thread Ghislain Vaillant
[Piotr Ożarowski]
> > For instance, I have a source package (pytest-qt) which builds a Python
> > 3 binary package and its corresponding documentation. Right now, they
> > are respectively named python3-pytest-qt and pytest-qt-doc.
> 
> I'd use python-modulename-doc even for new packages that provide
> python3-modulename binary package only

Ok.

> BTW, it's pytestqt, not pytest-qt, so the binary package name for Python 3
> should be python3-pytestqt (source name: pytest-qt)

Considering pytest plugins aren't meant to be used directly, but by
pytest via the registered entry-point, using "pytestqt" over "pytest-
qt" for the binary package sounded unnecessary to me.

And yes, I do know there is a policy and I follow it closely. In this
particular case however, we would be breaking the consistency between
the naming of the other pytest plugins for no obvious benefit to me.

> > Shall we keep the current python- prefix (as per Python the language,
> > not Python 2 the version)
> 
> that would be my pick

So given your criteria above, you would choose:

- python3-pytestqt
- python-pytestqt-doc

Am I correct?

Is everyone happy with that?

Cheers,
Ghis



Re: Naming convention for -doc package

2017-02-10 Thread Ghislain Vaillant
On Thu, 2017-02-09 at 18:58 -0500, Sandro Tosi wrote:
> On Thu, Feb 9, 2017 at 5:40 PM, Ghislain Vaillant <ghisv...@gmail.com> wrote:
> > On Thu, 2017-02-09 at 16:51 -0500, Sandro Tosi wrote:
> > > On Thu, Feb 9, 2017 at 3:17 PM, Ghislain Vaillant <ghisv...@gmail.com> 
> > > wrote:
> > > > Now that new packages are targeting the Buster cycle, and that Python 2
> > > > packages should no longer be built,
> > > 
> > > this is news to me, can you point me to where this was announced?
> > 
> > Announced, I don't know. But:
> > 
> > https://lintian.debian.org/tags/new-package-should-not-package-python2-module.html
> > 
> > Unless I am missing something?
> 
> this was triggered by
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=829744 -- sigh

Thanks for finding it out. So based on #829744, both pytest-qt and
pytest-xvfb, which are new packages, do not produce a corresponding
Python 2 binary package.

Back to the original question, what about the naming for -doc packages?

Ghis



Re: Naming convention for -doc package

2017-02-09 Thread Ghislain Vaillant
On Thu, 2017-02-09 at 16:51 -0500, Sandro Tosi wrote:
> On Thu, Feb 9, 2017 at 3:17 PM, Ghislain Vaillant <ghisv...@gmail.com> wrote:
> > Now that new packages are targeting the Buster cycle, and that Python 2
> > packages should no longer be built,
> 
> this is news to me, can you point me to where this was announced?

Announced, I don't know. But:

https://lintian.debian.org/tags/new-package-should-not-package-python2-module.html

Unless I am missing something?



Naming convention for -doc package

2017-02-09 Thread Ghislain Vaillant
Just to get the opinion from the team,

Now that new packages are targeting the Buster cycle, and that Python 2
packages should no longer be built, how should the corresponding -doc
packages be named?

For instance, I have a source package (pytest-qt) which builds a Python
3 binary package and its corresponding documentation. Right now, they
are respectively named python3-pytest-qt and pytest-qt-doc.

Shall we keep the current python- prefix (as per Python the language,
not Python 2 the version), use a python3- prefix, or drop the prefix
(as I temporarily did)?

Thought I'd better ask than be sorry later.

Cheers,
Ghis



Re: Re: Moving off of git-dpm (Re: git-dpm breakage src:faker)

2017-02-07 Thread Ghislain Vaillant
I know the discussion is leaning towards replacing usage of git-dpm
with gbp-pq. I have nothing against it but, since we are talking about
solutions for a git-centric workflow, has anyone considered the dgit-
maint-merge workflow [1]?

It is very well documented and would simplify team-packaging policies a
good deal. Assuming dgit-maint-merge were widely adopted, packaging
policies would only need to cover team-specific details, such as
infrastructure or communication channels for sponsorship, and then
reference the dgit-maint-merge manpages for the packaging workflow.

[1] https://www.mankier.com/7/dgit-maint-merge

Best regards,
Ghis



Re: Team upload for python-jedi

2017-01-21 Thread Ghislain Vaillant
[Piotr Ożarowski]
FYI: I uploaded 0.10.0~git1+f05c071-1 (without running tests during
build for now)

Great. Too bad for the wasted efforts on my end then. *sigh*

As far as testing is concerned, I am forwarding the set of patches
which were necessary to get the tests working on top of 0.9.0.

I'll let you verify whether they still apply on the snapshot you
uploaded regardless. 


Also:

"Drop DPMT from Uploaders (due to problems with multiple tarballs in
git-dpm)"

Then, the package is no longer team-maintained?


Regards,
Ghis

From 4f2f807ca86b2102f720fe383e323bb613e18189 Mon Sep 17 00:00:00 2001
From: Dave Halter 
Date: Sat, 9 Jul 2016 17:27:57 +0200
Subject: An empty path given to Jedi should not raise errors. Fixes #577.

---
 test/test_integration_import.py | 10 +-
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/test/test_integration_import.py b/test/test_integration_import.py
index 6b5ad73a..d961666c 100644
--- a/test/test_integration_import.py
+++ b/test/test_integration_import.py
@@ -18,17 +18,17 @@ def test_goto_definition_on_import():
 def test_complete_on_empty_import():
     assert Script("from datetime import").completions()[0].name == 'import'
     # should just list the files in the directory
-    assert 10 < len(Script("from .", path='').completions()) < 30
+    assert 10 < len(Script("from .", path='whatever.py').completions()) < 30
 
     # Global import
-    assert len(Script("from . import", 1, 5, '').completions()) > 30
+    assert len(Script("from . import", 1, 5, 'whatever.py').completions()) > 30
     # relative import
-    assert 10 < len(Script("from . import", 1, 6, '').completions()) < 30
+    assert 10 < len(Script("from . import", 1, 6, 'whatever.py').completions()) < 30
 
     # Global import
-    assert len(Script("from . import classes", 1, 5, '').completions()) > 30
+    assert len(Script("from . import classes", 1, 5, 'whatever.py').completions()) > 30
     # relative import
-    assert 10 < len(Script("from . import classes", 1, 6, '').completions()) < 30
+    assert 10 < len(Script("from . import classes", 1, 6, 'whatever.py').completions()) < 30
 
     wanted = set(['ImportError', 'import', 'ImportWarning'])
     assert set([c.name for c in Script("import").completions()]) == wanted
From faf18bdf335da4384f321886dfa2167525d65e51 Mon Sep 17 00:00:00 2001
From: Sid Shanker 
Date: Sun, 17 May 2015 23:11:02 -0700
Subject: Fixed utf-8 decoding error in build.

---
 test/run.py | 9 ++---
 1 file changed, 6 insertions(+), 3 deletions(-)

diff --git a/test/run.py b/test/run.py
index a48e1fb2..d7309143 100755
--- a/test/run.py
+++ b/test/run.py
@@ -290,9 +290,12 @@ def collect_dir_tests(base_dir, test_files, check_thirdparty=False):
 skip = 'Thirdparty-Library %s not found.' % lib
 
 path = os.path.join(base_dir, f_name)
-source = open(path).read()
-if not is_py3:
-source = unicode(source, 'UTF-8')
+
+if is_py3:
+source = open(path, encoding='utf-8').read()
+else:
+source = unicode(open(path).read(), 'UTF-8')
+
 for case in collect_file_tests(StringIO(source),
lines_to_execute):
 case.path = path


Re: Team upload for python-jedi

2017-01-19 Thread Ghislain Vaillant
On Thu, 2017-01-19 at 20:45 +0300, Dmitry Shachnev wrote:
> Hi Ghislain,
> 
> On Wed, Jan 18, 2017 at 05:53:05PM +, Ghislain Vaillant wrote:
> > Ok, I have got a working package fixing the RC. However, whoever did
> > the migration from svn to git forgot that the source tree was made of
> > multiple tarballs (one for jedi, one for jedi-vim) and now the vim
> > plugin package cannot be produced because of the missing sources [1].
> > 
> > How should I proceed now?
> > 
> > We could just drop the vim plugin package for now (it does not work
> > anyway due to #841043), and consider introducing a new source package
> > for it later. Afterall, they are separate projects on GitHub [2, 3].
> > 
> > Otherwise, I guess the svn migration would have to be re-run? I have no
> > idea how to do it, nor how to set up git-dpm to use multiple tarballs.
> 
> The SVN to Git migration was done automatically. I think the script was
> just not smart enough to deal with multiple tarballs properly.

Ok.


> For now you can just forget about git-dpm, get the sources manually and
> copy the Debian directory from Git on top of them.

Just to be sure, do you mean I should leave the repository alone and
merge my work in a fresh import-dsc of the current package?


> Now it is too late to fix the DPMT migration script, but it may be not
> too late to make sure the problem does not appear with PAPT.

Possibly.


Ghis



Re: Team upload for python-jedi

2017-01-19 Thread Ghislain Vaillant
I apologize for insisting here, but I need this RC fixed ASAP for
another package relying on it and have no idea what to do now.

Thanks,
Ghis


On Wed, 2017-01-18 at 17:53 +, Ghislain Vaillant wrote:
> On Wed, 2017-01-18 at 13:37 +0100, Piotr Ożarowski wrote:
> > Hi,
> > 
> > [Ghislain Vaillant, 2017-01-18]
> > > Would you be ok if I push the changes required to fix #830399 and
> > > #841043 for src:python-jedi and prepare a team-upload?
> > > 
> > > I need the RC fixed for the packaging of spyder.
> > 
> > go ahead. I have almost working package with latest changes from upstream
> > git repo but some tests still fail and I don't have time to work on it
> > right now.
> 
> Ok, I have got a working package fixing the RC. However, whoever did
> the migration from svn to git forgot that the source tree was made of
> multiple tarballs (one for jedi, one for jedi-vim) and now the vim
> plugin package cannot be produced because of the missing sources [1].
> 
> [1] 
> https://anonscm.debian.org/cgit/python-modules/packages/python-jedi.git/tree/
> 
> 
> How should I proceed now?
> 
> We could just drop the vim plugin package for now (it does not work
> anyway due to #841043), and consider introducing a new source package
> for it later. Afterall, they are separate projects on GitHub [2, 3].
> 
> Otherwise, I guess the svn migration would have to be re-run? I have no
> idea how to do it, nor how to set up git-dpm to use multiple tarballs.
> 
> [2] https://github.com/davidhalter/jedi
> [3] https://github.com/davidhalter/jedi-vim
> 
> 
> Cheers,
> Ghis



Re: RFS: python-cartopy/0.14.2-2

2017-01-19 Thread Ghislain Vaillant
Forwarding to d-python, who might know of a solution to this issue.

The gist of it is that we are looking for a suitable solution to bypass
a specific test which is failing on a subset of 32-bit architectures
(i386 only).

My initial patch was quite blunt and disabled the test for any 32-bit
arch, and James proposed a different solution based on a subprocess
call to dpkg-architecture. Do you guys have any suggestion of a more
Pythonic way to achieve this?

Cheers,
Ghis


On Tue, 2017-01-17 at 23:36 +, Ghislain Vaillant wrote:
> On Tue, 2017-01-17 at 20:39 +, James Clarke wrote:
> > On Tue, Jan 17, 2017 at 07:59:25PM +, Ghislain Vaillant wrote:
> > > Hi James,
> > > 
> > > Le 17 janv. 2017 7:44 PM, "James Clarke" <jrt...@debian.org> a écrit :
> > > 
> > > On Tue, Jan 17, 2017 at 07:22:32PM +, Ghislain Vaillant wrote:
> > > > Dear all,
> > > > 
> > > > I am looking for a sponsor for python-cartopy:
> > > > 
> > > >   https://anonscm.debian.org/cgit/debian-science/packages/
> > > 
> > > python-cartopy.git
> > > > 
> > > > 
> > > > Changes since the last upload:
> > > > 
> > > >   [ Andreas Tille ]
> > > >   * Do not force the package maintainer to install build-dependencies 
> > > > like
> > > > geos on local machine
> > > > 
> > > >   [ Ghislain Antony Vaillant ]
> > > >   * Fix FTBFS on 32-bit architectures (Closes: #848634)
> > > > New patch 0001-Skip-tests-failing-on-32-bit-architectures.patch
> > > > 
> > > > 
> > > > The test skip patch is a result of upstream being not responsive to the
> > > > issue so far. Since the failing test is fairly minor, I see no reason
> > > > to block the package for non 32-bit architectures.
> > > 
> > > This doesn't look right; it only failed on i386 (and
> > > hurd/kfreebsd-i386), but it built on armel, armhf, mips, mipsel, hppa,
> > > m68k and powerpc. This is not a 32-bit issue, but an x86 floating-point
> > > issue, most likely because the x87 FPU uses 80-bit floats internally.
> > > 
> > > 
> > > Would you have a better patch to propose then?
> > > 
> > > If you're going to disable the test, please do so *only* if the CPU is
> > > i386.
> > > 
> > > 
> > > I only know how to test for 32-bit, not for *i386 specifically, in Python.
> > > Please advise if you do.
> > 
> > There are platform.architecture()/machine()/processor(), but processor()
> > seems to give '' on Debian, machine() will give x86_64 if using an i386
> > chroot on an amd64 host, and architecture() just gives ('32bit', '').
> 
> Not to mention that platform.architecture is notoriously unreliable to
> detect bitness from my initial reading (Python docs and SO).
> 
> > Then there's the additional complication that x32 is a 32-bit version of
> > amd64, so it shouldn't fail the test. Therefore, I would propose
> > something like the following Debian-specific hack:
> > 
> > host_cpu = subprocess.check_output(['dpkg-architecture',
> >                                     '-qDEB_HOST_ARCH_CPU'])
> > if host_cpu != 'i386':
> >     # Do the assert
> 
> Is it really worth going to so much trouble for just one test, you
> reckon? I understand your rationales for it and appreciate the
> pointers, but my pragmatic self is wondering whether putting such a
> hack in place would be worth it from a maintenance perspective.
> 
> > I would be interested to know if there's a way of doing this without
> > having to call dpkg-architecture.
> 
> I would prefer such a solution too, TBH.
> 
> Ghis
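For the record, a self-contained sketch of James's dpkg-architecture
idea: note that check_output returns bytes with a trailing newline, so
the result needs decoding and stripping before comparison, and a
fallback to platform.machine() (an assumption on my part) keeps the
helper usable where dpkg-architecture is not installed:

```python
import platform
import subprocess


def debian_host_cpu():
    """Return the Debian host CPU string (e.g. 'i386', 'amd64').

    Falls back to platform.machine() where dpkg-architecture is
    unavailable, so the sketch stays usable outside a Debian chroot.
    """
    try:
        out = subprocess.check_output(
            ['dpkg-architecture', '-qDEB_HOST_ARCH_CPU'])
        # check_output returns bytes ending in a newline
        return out.decode('ascii').strip()
    except (OSError, subprocess.CalledProcessError):
        return platform.machine()


if debian_host_cpu() != 'i386':
    # run the x87-sensitive assertion only off i386
    pass
```

In a test suite this would sit more naturally in a skip marker, e.g.
`@pytest.mark.skipif(debian_host_cpu() == 'i386', reason='x87 FPU')`.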



Re: Team upload for python-jedi

2017-01-18 Thread Ghislain Vaillant
On Wed, 2017-01-18 at 13:37 +0100, Piotr Ożarowski wrote:
> Hi,
> 
> [Ghislain Vaillant, 2017-01-18]
> > Would you be ok if I push the changes required to fix #830399 and
> > #841043 for src:python-jedi and prepare a team-upload?
> > 
> > I need the RC fixed for the packaging of spyder.
> 
> go ahead. I have almost working package with latest changes from upstream
> git repo but some tests still fail and I don't have time to work on it
> right now.

Ok, I have got a working package fixing the RC. However, whoever did
the migration from svn to git forgot that the source tree was made of
multiple tarballs (one for jedi, one for jedi-vim) and now the vim
plugin package cannot be produced because of the missing sources [1].

[1] 
https://anonscm.debian.org/cgit/python-modules/packages/python-jedi.git/tree/


How should I proceed now?

We could just drop the vim plugin package for now (it does not work
anyway due to #841043), and consider introducing a new source package
for it later. Afterall, they are separate projects on GitHub [2, 3].

Otherwise, I guess the svn migration would have to be re-run? I have no
idea how to do it, nor how to set up git-dpm to use multiple tarballs.

[2] https://github.com/davidhalter/jedi
[3] https://github.com/davidhalter/jedi-vim


Cheers,
Ghis



Team upload for python-jedi

2017-01-18 Thread Ghislain Vaillant
Hi Piotr,

Would you be ok if I push the changes required to fix #830399 and
#841043 for src:python-jedi and prepare a team-upload?

I need the RC fixed for the packaging of spyder.

Cheers,
Ghis



status of python-freetype packaging

2016-06-13 Thread Ghislain Vaillant

Hi Daniel,

Any news on this? Do you need any help?

Ghis



Re: help2man usage with pybuild / debhelper packaging workflow

2016-06-04 Thread Ghislain Vaillant

On 04/06/16 05:25, Paul Wise wrote:

On Sat, Jun 4, 2016 at 12:30 AM, Christian Seiler wrote:


Well, you could add a custom target to debian/rules that calls
help2man for all these scripts - so that you as a maintainer
can refresh the manpages every now and then. (And store them
in debian/ in the packaging.)  That way, you don't break cross
builds (manpages are pre-generated), but still automate it to
a large extent.


I don't think it is ever appropriate to store pre-generated files in
source packages, neither in the upstream tarball nor in the Debian
tarball (except for autotools cruft or VCS metadata through
autorevision).

For Python stuff, it is generally arch all and never needs to be
cross-built so help2man is fine.


That is my case, indeed.


In any case, using something like sphinx and sphinxcontrib-autoprogram
or python3-sphinx-argparse plus manual page source in Markdown or
reStructuredText format is a better way to go since you get a nice
format to write in and automatically sync your --help output with the
manual page.


I like this approach. Any example you may have in mind?

Cheers,
Ghis



Re: help2man usage with pybuild / debhelper packaging workflow

2016-06-03 Thread Ghislain Vaillant

On 03/06/16 17:59, Wookey wrote:

On 2016-06-03 18:30 +0200, Christian Seiler wrote:

On 06/03/2016 06:25 PM, Ghislain Vaillant wrote:



And I don't mind for a handful of scripts. But what if you have 20 or
30?


Well, you could add a custom target to debian/rules that calls
help2man for all these scripts - so that you as a maintainer
can refresh the manpages every now and then. (And store them
in debian/ in the packaging.)  That way, you don't break cross
builds (manpages are pre-generated), but still automate it to
a large extent.


Yep, that's a reasonable plan.


Ack.
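As a concrete illustration, such a maintainer-only target in
debian/rules could look roughly like this (a sketch; the bin/ script
location and the debian/manpages/ output directory are assumptions,
and the target is meant to be run by hand, never during the build):

```make
# Maintainer convenience target: regenerate the pre-built manpages
# shipped in debian/manpages/. Run manually, not during the package
# build, so cross-builds are unaffected.
refresh-manpages:
	mkdir -p debian/manpages
	set -e; for script in bin/*; do \
		help2man --no-info \
			--output=debian/manpages/$$(basename $$script).1 \
			$$script; \
	done
```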


Missing man pages is better than things that gratuitously won't
cross-build.


So it is clearly a no-no. I guess the wiki page is there to give a
starting point for writing manpages, not to automate the process of
generating them.


Just copying the help into a man page is mostly makework
to shut lintian up. They are often very poor manpages. If the help is
there already or upstream documentation is some other format
(e.g. html), then if you _really_ can't be bothered making man pages,
just leave it as upstream supplied. Using help2man is a worse
'solution' than doing nothing.


In this case, should I leave the lintian warning on, or override it
with a comment explaining why?

Ghis



Re: help2man usage with pybuild / debhelper packaging workflow

2016-06-03 Thread Ghislain Vaillant

On 03/06/16 17:10, Wookey wrote:

On 2016-06-03 16:43 +0100, Ghislain Vaillant wrote:

Dear all,

Are there any successful examples of integration of help2man with a
pybuild / debhelper workflow for an arbitrary number of scripts?


help2man breaks cross-building so is best avoided if you can.
Please just write a man page.


Yes, it is mentioned here [1].

[1] https://wiki.debian.org/ManPage/help2man

And I don't mind for a handful of scripts. But what if you have 20 or
30?

Ghis



help2man usage with pybuild / debhelper packaging workflow

2016-06-03 Thread Ghislain Vaillant

Dear all,

Are there any successful examples of integration of help2man with a
pybuild / debhelper workflow for an arbitrary number of scripts?

The only close example I could find was the stdeb package, but I am
dealing with many more scripts and I cannot afford to list them all
individually by hand.

Best regards,
Ghis



Re: Maintenance of pydap

2016-06-02 Thread Ghislain Vaillant

Hi Sandro,

I engaged with the pydap community and I can confirm that the project
has moved on from the old 2.x API. Since pydap 3.x is now using a new
namespace (pydap instead of dap), then both package versions should be
co-installable.

Since I need the pydap 3.x API for hdf-compass, I propose to introduce
a new source package for it (which I am happy to maintain) and keep the
old source package alive until all reverse dependencies relying on 2.x
eventually move on.

How does that sound?

Ghis

On 28/01/16 09:50, Ghislain Vaillant wrote:

I believe upstream has moved on from the API of 2.x (dap namespace) to
using 3.x (pydap namespace). My assumption is based on the facts that:

1) the upstream repository [1] provide tags for 3.x releases only,
2) active issues, such as Python 3 support, are milestoned for 3.2.x [2],

[1] https://github.com/robertodealmeida/pydap
[2] https://github.com/robertodealmeida/pydap/issues/7

Since dap 2.x is no longer actively maintained, then the question is
whether to introduce a new source package for pydap 3.x or just upgrade
the current source package to 3.x and take care of the rdep for
python-mpltoolkits.basemap.

Thoughts?
Ghis


On 28/01/16 09:26, Sandro Tosi wrote:

for a start, you can try to get upstream to reply to this (old)
question: https://groups.google.com/forum/#!topic/pydap/6A6q-Y6qwtE

On Thu, Jan 28, 2016 at 9:24 AM, Ghislain Vaillant
<ghisv...@gmail.com> wrote:

Forwarding to d-python, as it seems like a more appropriate place to
discuss this matter.


 Forwarded Message 
Subject: Maintenance of pydap
Date: Mon, 25 Jan 2016 08:00:32 +
From: Ghislain Vaillant <ghisv...@gmail.com>
To: mo...@debian.org
CC: python-modules-t...@lists.alioth.debian.org

Ciao Sandro,

I recently got involved with the packaging of the HDF compass [1], a
viewer for HDF5 written in Python and using pydap as an install
dependency.

The version of pydap currently packaged is too old to work with the
compass, first because version 3.x apparently changed the module API
from using `import dap` to `import pydap`.

I saw the following bug report [2] requesting an update of the package,
but it was not acted upon or acknowledged. I am thereby contacting you
to know whether the package was still actively maintained on your end.

If you have lost interest in maintaining this package, myself and the
Debian Science Team would be interested in taking over, since both
hdf-compass and pydap are science related.

Otherwise, please allow me to request an update to this package so I
can carry on with my packaging work on hdf-compass.

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=812434
[2] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=788508

Many thanks,
Ghis








Re: Bug in pybuild's handling of --install-lib? [Was: Re: entry-point script and private module install directory]

2016-05-12 Thread Ghislain Vaillant

On 12/05/16 13:16, Piotr Ożarowski wrote:

[Ghislain Vaillant, 2016-05-12]

Is this a bug in pybuild or am I missing something?


you're missing my second reply¹

[¹] https://lists.debian.org/debian-python/2016/05/msg00043.html


Indeed, sorry for the noise.

Thanks again for the support.

Ghis



Bug in pybuild's handling of --install-lib? [Was: Re: entry-point script and private module install directory]

2016-05-12 Thread Ghislain Vaillant

On 12/05/16 09:22, Ghislain Vaillant wrote:

On 11/05/16 18:55, Piotr Ożarowski wrote:

you can create a wrapper or patch /usr/bin script to
sys.path.append('/usr/share/pyfr') but the easiest solution is to
install the script to /usr/share/pyfr/ (if the module is "pyfr" as well,
simply rename the script to "run" or any other name) and then symlink it
to /usr/bin/pyfr)

   override_dh_auto_install:
	dh_auto_install -- --install-lib=/usr/share/pyfr/
	mv debian/pyfr/usr/bin/pyfr debian/pyfr/usr/share/pyfr/run

and add "/usr/share/pyfr/run /usr/bin/pyfr" to debian/pyfr.links


Thanks Piotr, that's the solution I have been looking for.

Ghis


Actually, on second look, Piotr's solution currently fails:

d/rules:

export PYBUILD_DESTDIR=$(CURDIR)/debian/tmp
[...]
override_dh_auto_install:
	dh_auto_install -- --install-lib=/usr/share/pyfr
	mv $(PYBUILD_DESTDIR)/usr/bin/pyfr $(PYBUILD_DESTDIR)/usr/share/pyfr/run


build log:

dh_auto_install -- --install-lib=/usr/share/pyfr
install -d debian/pyfr
install -d debian/pyfr-doc
	pybuild --install -i python{version} -p 3.5 --install-lib=/usr/share/pyfr --dir . --dest-dir /<>/debian/tmp

usage: pybuild [ACTION] [BUILD SYSTEM ARGS] [DIRECTORIES] [OPTIONS]
pybuild: error: unrecognized arguments: --install-lib=/usr/share/pyfr


Whereas this succeeds:

d/rules:

export PYBUILD_DESTDIR=$(CURDIR)/debian/tmp
export PYBUILD_INSTALL_ARGS=--install-lib=/usr/share/pyfr
[...]
override_dh_auto_install:
	dh_auto_install
	mv $(PYBUILD_DESTDIR)/usr/bin/pyfr $(PYBUILD_DESTDIR)/usr/share/pyfr/run



build log:

dh_auto_install
install -d debian/pyfr
install -d debian/pyfr-doc
	pybuild --install -i python{version} -p 3.5 --dir . --dest-dir /<>/debian/tmp
I: pybuild base:184: /usr/bin/python3 setup.py install --root /<>/debian/tmp --install-lib=/usr/share/pyfr

running install
running build
running build_py
running install_lib
[...]


Is this a bug in pybuild or am I missing something?



Re: entry-point script and private module install directory

2016-05-12 Thread Ghislain Vaillant

On 11/05/16 18:55, Piotr Ożarowski wrote:

[Ghislain Vaillant, 2016-05-11]

Dear all,

I have a package (pyfr), which is meant to be used as a command-line
application only.

The main script (pyfr) is installed via setuptools'
entry_points['console_scripts'], which generates the entry-point
automatically and places it under /usr/bin. However, when I install the
implementation module in a private location, such as /usr/share/pyfr,
the entry-point cannot find the module and load the application.

Right now, the module is installed in the dist-packages location,
although it is not intended to be public. It was just the easiest
solution to start with at the time.

What is the standard way to circumvent this? Do other packages use custom


you can create a wrapper or patch /usr/bin script to
sys.path.append('/usr/share/pyfr') but the easiest solution is to
install the script to /usr/share/pyfr/ (if the module is "pyfr" as well,
simply rename the script to "run" or any other name) and then symlink it
to /usr/bin/pyfr)

   override_dh_auto_install:
	dh_auto_install -- --install-lib=/usr/share/pyfr/
	mv debian/pyfr/usr/bin/pyfr debian/pyfr/usr/share/pyfr/run

and add "/usr/share/pyfr/run /usr/bin/pyfr" to debian/pyfr.links


Thanks Piotr, that's the solution I have been looking for.

Ghis
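To make the wrapper variant Piotr also mentions concrete, here is a
self-contained sketch; the temporary directory and pyfr_stub module
stand in for the real /usr/share/pyfr and pyfr package, which are
assumptions for illustration only:

```python
import os
import sys
import tempfile

# Stand-in for the private install directory /usr/share/pyfr.
private_dir = tempfile.mkdtemp()
with open(os.path.join(private_dir, 'pyfr_stub.py'), 'w') as f:
    f.write("def main():\n    return 'application started'\n")

# This is the one line a /usr/bin wrapper script would add before
# importing the application module from its private location.
sys.path.insert(0, private_dir)

import pyfr_stub
print(pyfr_stub.main())  # prints 'application started'
```

The symlink approach above avoids even this one line of wrapper code,
which is why it is the easier of the two.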



entry-point script and private module install directory

2016-05-11 Thread Ghislain Vaillant

Dear all,

I have a package (pyfr), which is meant to be used as a command-line
application only.

The main script (pyfr) is installed via setuptools'
entry_points['console_scripts'], which generates the entry-point
automatically and places it under /usr/bin. However, when I install the
implementation module in a private location, such as /usr/share/pyfr,
the entry-point cannot find the module and load the application.
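(For context, the relevant setup.py excerpt looks roughly like this — the
module path is assumed for illustration, not taken from pyfr's sources:)

```python
# Hypothetical excerpt from pyfr's setup.py: setuptools generates
# /usr/bin/pyfr from this declaration, importing the named module at
# startup -- which fails if the module lives outside sys.path.
from setuptools import setup

setup(
    name='pyfr',
    entry_points={
        'console_scripts': ['pyfr = pyfr.__main__:main'],  # assumed entry point
    },
)
```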

Right now, the module is installed in the dist-packages location,
although it is not intended to be public. It was just the easiest
solution to start with at the time.

What is the standard way to circumvent this? Do other packages use 
custom wrapper scripts, or is there a more clever way to tell the 
entry-point to look for a custom package location? Is there a good
example you guys can point me to?

There was a similar discussion for the grip [1] package on this list,
but it did not lead to something I can use, or so I believe.

[1] https://lists.debian.org/debian-python/2016/04/msg1.html

Many thanks,
Ghis



Re: Request to join the DPMT

2016-02-20 Thread Ghislain Vaillant

Bumping this, in the hope that someone with administrative rights can
answer my request to join the team. Thanks.

Ghislain



Request to join the DPMT

2016-02-14 Thread Ghislain Vaillant

Hello DPMT members,

I would like to join the DPMT. I am essentially involved in the Debian
Science Team, but I am also working with Python dependencies which are
more generic and would be better suited for team-maintenance here.

My alioth account is ghisvail-guest. I did request to join via Alioth
but never got a response, hence this request on the mailing-list.

I have read and accept the DPMT's policy.

Cheers,
Ghislain



Fwd: Maintenance of pydap

2016-01-28 Thread Ghislain Vaillant

Forwarding to d-python, as it seems like a more appropriate place to
discuss this matter.

 Forwarded Message 
Subject: Maintenance of pydap
Date: Mon, 25 Jan 2016 08:00:32 +
From: Ghislain Vaillant <ghisv...@gmail.com>
To: mo...@debian.org
CC: python-modules-t...@lists.alioth.debian.org

Ciao Sandro,

I recently got involved with the packaging of the HDF compass [1], a
viewer for HDF5 written in Python and using pydap as an install
dependency.

The version of pydap currently packaged is too old to work with the
compass, chiefly because version 3.x apparently changed the module API
from using `import dap` to `import pydap`.

I saw the following bug report [2] requesting an update of the package,
but it was not acted upon or acknowledged. I am therefore contacting you
to ask whether the package is still actively maintained on your end.

If you have lost interest in maintaining this package, myself and the
Debian Science Team would be interested in taking over, since both
hdf-compass and pydap are science related.

Otherwise, please allow me to request an update to this package so I
can carry on with my packaging work on hdf-compass.

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=812434
[2] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=788508

Many thanks,
Ghis



Re: Maintenance of pydap

2016-01-28 Thread Ghislain Vaillant
I believe upstream has moved on from the API of 2.x (dap namespace) to 
using 3.x (pydap namespace). My assumption is based on the facts that:


1) the upstream repository [1] provide tags for 3.x releases only,
2) active issues, such as Python 3 support, are milestoned for 3.2.x [2],

[1] https://github.com/robertodealmeida/pydap
[2] https://github.com/robertodealmeida/pydap/issues/7

Since dap 2.x is no longer actively maintained, the question is 
whether to introduce a new source package for pydap 3.x or just upgrade 
the current source package to 3.x and take care of the rdep for 
python-mpltoolkits.basemap.


Thoughts?
Ghis


On 28/01/16 09:26, Sandro Tosi wrote:

for a start, you can try to get upstream to reply to this (old)
question: https://groups.google.com/forum/#!topic/pydap/6A6q-Y6qwtE

On Thu, Jan 28, 2016 at 9:24 AM, Ghislain Vaillant <ghisv...@gmail.com> wrote:

Forwarding to d-python, as it seems like a more appropriate place to
discuss this matter.


 Forwarded Message 
Subject: Maintenance of pydap
Date: Mon, 25 Jan 2016 08:00:32 +
From: Ghislain Vaillant <ghisv...@gmail.com>
To: mo...@debian.org
CC: python-modules-t...@lists.alioth.debian.org

Ciao Sandro,

I recently got involved with the packaging of the HDF compass [1], a
viewer for HDF5 written in Python and using pydap as an install
dependency.

The version of pydap currently packaged is too old to work with the
compass, first because version 3.x apparently changed the module API
from using `import dap` to `import pydap`.

I saw the following bug report [2] requesting an update of the package,
but it was not acted upon or acknowledged. I am thereby contacting you
to know whether the package was still actively maintained on your end.

If you have lost interest in maintaining this package, myself and the
Debian Science Team would be interested in taking over, since both
hdf-compass and pydap are science related.

Otherwise, please allow me to request an update to this package so I
can carry on with my packaging work on hdf-compass.

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=812434
[2] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=788508

Many thanks,
Ghis








Re: How to use Debian packaged lapack in python-miso

2015-02-16 Thread Ghislain Vaillant
Hi Andreas,

Could you provide an http link to the upstream sources please ?

Cheers,


2015-02-16 13:59 GMT+00:00 Andreas Tille andr...@an3as.eu:

 Hi,

 I intend to package python MISO on behalf of the Debian Med team.  The
 packaging code is in SVN at

   Vcs-Svn: svn://anonscm.debian.org/debian-med/trunk/packages/python-miso/trunk/

 It builds and runs the test suite but I'd like to drop the lapack code
 copy.

 I have not found any way to convince setup.py to simply link against
 the dynamic library in Debian.  It simply compiles all
 pysplicing/src/lapack/*.c files and links all remaining *.o files.  Any
 idea how to use the Debian packaged library?
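(For reference: the usual fix is to patch setup.py so the extension links
against the system library instead of compiling the bundled sources. A
hypothetical sketch — MISO's real extension name and source layout may
differ:)

```python
# Hypothetical patch to MISO's setup.py: drop the bundled lapack/*.c
# sources and link the extension against Debian's shared liblapack instead.
from setuptools import Extension

pysplicing = Extension(
    'pysplicing',
    sources=['pysplicing/src/pysplicing.c'],  # bundled lapack sources removed
    libraries=['lapack'],                     # adds -llapack at link time
)
print(pysplicing.libraries)
```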

 Kind regards

Andreas.

 --
 http://fam-tille.de


 --
 To UNSUBSCRIBE, email to debian-mentors-requ...@lists.debian.org
 with a subject of unsubscribe. Trouble? Contact
 listmas...@lists.debian.org
 Archive: https://lists.debian.org/20150216135905.gc23...@an3as.eu




Re: Fwd: [iva] - Python 3 code depends on pysam

2014-11-24 Thread Ghislain Vaillant
At worst, can't you just disable the test suite for the Python 3 builds?
Pybuild should let you do that easily.

Ghis
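(Concretely, pybuild supports disabling steps per interpreter via
environment variables in debian/rules — a sketch assuming a dh-based
rules file:)

```make
# Sketch of a dh-based debian/rules: skip the test step for the Python 3
# builds only, while Python 2 keeps running the test suite.
export PYBUILD_DISABLE_python3=test

%:
	dh $@ --with python2,python3 --buildsystem=pybuild
```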

2014-11-24 16:36 GMT+00:00 Jorge Sebastião Soares j.s.soa...@gmail.com:

 Hi guys,

 So essentially the package build halts when it tries to run the test suite:

 This is the error I'm getting when the pysam module is being imported:

 root@debian:~/iva-0.10.0# python3.4 setup.py test
 running test
 running egg_info
 writing top-level names to iva.egg-info/top_level.txt
 writing iva.egg-info/PKG-INFO
 writing dependency_links to iva.egg-info/dependency_links.txt
 reading manifest file 'iva.egg-info/SOURCES.txt'
 writing manifest file 'iva.egg-info/SOURCES.txt'
 running build_ext
 Failure: ImportError (No module named 'pysam') ... ERROR

 ==
 ERROR: Failure: ImportError (No module named 'pysam')
 --
 Traceback (most recent call last):
   File /usr/lib/python3/dist-packages/nose/failure.py, line 39, in
 runTest
 raise self.exc_val.with_traceback(self.tb)
   File /usr/lib/python3/dist-packages/nose/loader.py, line 414, in
 loadTestsFromName
 addr.filename, addr.module)
   File /usr/lib/python3/dist-packages/nose/importer.py, line 47, in
 importFromPath
 return self.importFromDir(dir_path, fqname)
   File /usr/lib/python3/dist-packages/nose/importer.py, line 94, in
 importFromDir
 mod = load_module(part_fqname, fh, filename, desc)
   File /usr/lib/python3.4/imp.py, line 245, in load_module
 return load_package(name, filename)
   File /usr/lib/python3.4/imp.py, line 217, in load_package
 return methods.load()
   File frozen importlib._bootstrap, line 1220, in load
   File frozen importlib._bootstrap, line 1200, in _load_unlocked
   File frozen importlib._bootstrap, line 1129, in _exec
   File frozen importlib._bootstrap, line 1471, in exec_module
   File frozen importlib._bootstrap, line 321, in
 _call_with_frames_removed
   File /tmp/buildd/iva-0.10.0/iva/__init__.py, line 20, in module
 from iva import *
   File /tmp/buildd/iva-0.10.0/iva/assembly.py, line 2, in module
 import pysam
 ImportError: No module named 'pysam'

 --
 Ran 1 test in 0.016s

 FAILED (errors=1)


 If pysam is python 3 compliant, I'm tempted to create the needed symlinks
 in python3.4 pointing to pysam in python2.7, eg.

 ln -s /usr/lib/python2.7/dist-packages/pysam
 /usr/lib/python3.4/dist-packages/pysam

 I'm sure that this is not the proper way of doing things, so is there any
 other way I can get pysam to be installed under python3.4 rather than
 python2.7?

 Regards,

 Jorge



ITP: pyrecord -- Pythonic record types

2014-11-18 Thread Ghislain Vaillant
Package: wnpp
Severity: wishlist
Owner: Ghislain Antony Vaillant

* Package name  : pyrecord
  Version : 1.0.0~rc1
  Upstream Author   : Gustavo Narea
* URL : https://pythonhosted.org/pyrecord/
* License: Apache-2.0
  Programming Lang: Python
  Description   : Pythonic record types

Long description taken from PyPI:
 A record (aka struct in C) is a pre-defined collection
 of values where each is accessed by a unique name. Depending
 on the nature of the data, records may be a superior
 alternative to dictionaries and instances of custom classes.
 .
 PyRecord allows you to use records in Python v2.7 to v3.x and
 PyPy v2, and can be thought of as an improved namedtuple.
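(For readers unfamiliar with the concept, the standard library's
namedtuple illustrates the idea pyrecord builds on — note this is the
stdlib analogue, not pyrecord's own API:)

```python
# Stdlib analogue of a record type: values accessed by a unique field
# name rather than by index or dictionary key.
from collections import namedtuple

Point = namedtuple('Point', ['x', 'y'])
p = Point(x=1, y=2)
print(p.x + p.y)  # -> 3
```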

Considering the nature of this package, it should probably be
maintained under the Debian Python Team umbrella.

Cheers,
- Ghis


packaging advice for cmake generated python wrapper

2014-01-26 Thread Ghislain Vaillant
Hi guys,

I am currently trying to package the ISMRMRD library (#732360), which
uses cmake for its build system and generates the c++ library and
corresponding Java and Python wrappers using SWIG.

I was wondering how that would fit with Debian packaging and what
would be the best approach for it. I am also in close contact with
upstream, and they are inclined to modify the existing cmake files to
ease support for Debian packaging.

With the current set-up, cmake generates both the Java and Python
wrappers and the library in the same target folder, i.e. after make, all
resulting binaries are stored in /usr/local/ismrmrd/lib/, which
includes libismrmrd.so (shared library), ismrmrd.jar (Java wrapper), and
ismrmrd.so plus ismrmrd.py (Python wrapper). Besides, cmake only builds
the Python wrapper for Python 2 via SWIG.

I have packaged Python-only projects before, using the convenience
offered by setup.py and setuptools. However, for this case I am
clueless and did not find much documentation or a similar case to
study.

Any advice on packaging / fixing the build system for this library would
be warmly welcome.

Thanks,

Ghislain

