On Wednesday, 15 March 2017 at 17:59:23 UTC, anatoly techtonik wrote:
> On Wed, Mar 15, 2017 at 11:27 AM, Ghislain Vaillant
> <[email protected]> wrote:
> > On Tuesday, 14 March 2017 at 17:21:20 UTC, anatoly techtonik wrote:
> >> On Mon, Mar 13, 2017 at 7:11 PM, Ghislain Vaillant
> >> <[email protected]> wrote:
> >> >
> >> > Early disclaimer: I am one of the Debian package maintainers for
> >> > spyder.
> >>
> >> Nice. I've created a recipe that builds Spyder automatically.
> >> https://code.launchpad.net/~techtonik/+recipe/spyderlib-daily-master
> >>
> >> But it seems I got the branches wrong. It builds from master
> >> https://code.launchpad.net/~techtonik/spyderlib/master
> >> merges it with the Debian packaging branch
> >> https://code.launchpad.net/~techtonik/spyderlib/debian-packaging
> >> and the resulting binary has version 3.1.2.
> >
> > You need to bump the version in debian/changelog accordingly.
>
> Can Debian packaging branches be identical to Spyder branches, so that
> nightly/daily builds could be completely automatic? I'd gladly move the
> recipes and artifacts into a team PPA on Launchpad, because I am at
> risk of being hit by a bus way too often.
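The debian/changelog bump mentioned in the exchange above is typically done with dch from the devscripts package. A minimal sketch, with an illustrative snapshot version scheme (not the actual spyder packaging convention):

```shell
# Bump debian/changelog for a daily snapshot build.
# UPSTREAM and the 0~ppa1 suffix are illustrative placeholders.
UPSTREAM=3.1.2
SNAPSHOT="${UPSTREAM}+git$(date +%Y%m%d)-0~ppa1"

# dch ships with devscripts: -v sets the new version,
# -D sets the target distribution for the changelog entry.
dch -v "$SNAPSHOT" -D yakkety "Daily snapshot build from the master branch"
```

This is run from the package source tree, i.e. the directory containing debian/.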
How do you intend to automate this, actually? Assuming no changes are
required to the packaging files, all you have to do is fetch the latest
tarball, import the debian folder on top of it, refresh the patches
(using quilt), bump the changelog version (using dch) and run
dpkg-buildpackage.

> >> I created another recipe to work directly with Git branches, but it
> >> fails to build at all.
> >> https://code.launchpad.net/~techtonik/+recipe/spyderlib-daily
> >
> > That's why I'd recommend using the backportpackage utility on the
> > Debian experimental or the Ubuntu Zesty package (which is sync'd
> > from experimental). Debian experimental will receive all subsequent
> > updates whilst the freeze is happening.
>
> PPA recipes don't allow arbitrary commands, so it won't run daily.
> Also, I am not sure how backportpackage can help; the builds are
> failing because the Debian checks don't like the Spyder sources:
> https://launchpadlibrarian.net/310888373/buildlog.txt.gz

You'd have to give me more details about your workflow here. This sort
of error usually indicates a mismatch between the version of the
upstream source and the version recorded in the Debian changelog.

> >> > On Sunday, 12 March 2017 at 14:39:31 UTC, anatoly techtonik wrote:
> >> >
> >> > It depends which definition of "stable" you mean, i.e. bug-free or
> >> > non-changing.
> >> >
> >> > Sure, each release of spyder introduces its lot of bug-fixes, but
> >> > also new features and enhancements which are likely to introduce
> >> > regressions. For instance, the recent breakage of code completion
> >> > with Jedi comes to my mind (thanks Carlos for fixing it for
> >> > 3.1.4).
> >>
> >> Bug-free.
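The update routine described at the start of this reply (fetch the tarball, import debian/, refresh the patches with quilt, bump the version with dch, build with dpkg-buildpackage) can be sketched as a shell session. The URL, version, and directory layout are placeholder assumptions, not the maintainers' actual workflow:

```shell
# Fetch the latest upstream tarball (URL and version are placeholders).
VERSION=3.1.4
wget "https://github.com/spyder-ide/spyder/archive/v${VERSION}.tar.gz"
tar -xzf "v${VERSION}.tar.gz"
cd "spyder-${VERSION}"

# Import the debian/ folder from a packaging branch checkout.
cp -r ../spyder-debian/debian .

# Refresh the patches with quilt.
export QUILT_PATCHES=debian/patches
quilt push -a        # re-apply every patch against the new source
quilt refresh        # record any offsets/fuzz in the topmost patch
quilt pop -a

# Bump the changelog (dch is in devscripts) and build.
dch -v "${VERSION}-1" "New upstream release"
dpkg-buildpackage -us -uc   # build unsigned source and binary packages
```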
> >> Regressions are not good, but it is better to have regressions and
> >> the ability to temporarily install an earlier version than to have
> >> no way to install a version without bugs.
> >
> > That's your opinion. Stable releases of Debian focus on being
> > regression-free from one version to the next. Should you need to
> > keep up with the latest updates of spyder, there are better
> > alternatives, like installing spyder in a venv or using Debian
> > testing/unstable, at the higher price of system maintenance.
>
> Yes. Often Debian/Ubuntu ships with an unusable version of the
> software I am interested in, so I usually get it from source and
> compile/patch it myself.

s/unusable/outdated.

> Especially for a project like Spyder, a Python IDE written in Python,
> which more or less assumes by its own unwritten guidelines that users
> are able to patch and share their changes.

I don't follow.

> That's not the case with Debian/Ubuntu, because changes are not
> really shared. They wait until permission.

I don't follow either.

> I remember the whole SourceForge source browser was broken, because
> Debian decided to remove a duplicated jQuery from the tracker, and
> when the tracker frontend was updated to the latest version, Debian
> could not revert it. Sometimes the developers of the software know
> better which version to use. That's why I asked in this thread.

There is a human element to maintaining packages, and as for all
human-dependent tasks, mistakes can happen. I don't reckon the spyder
package is in a similar state as the example you picked though, am I
right?

> Debian policies, which are the same for everything, with no
> user/developer feedback about the real update cycle, and no channels
> for alternative cycles. It is so outdated.

That's your opinion. That being said, if your interest solely revolves
around running the latest versions of your software at all times, then
packaging is *not* the deployment solution you are looking for.
As I mentioned earlier in this thread, it will be much more
straightforward to use pip.

> I mean there is a lot of space for improvement. Just no money to get
> to it. =) But whatever, I was once banned from Debian maintainers;
> for critics it is hard to get there. People like their ponies, and I
> am just lucky to be able to use free open source software, so let's
> leave it where it is. =)

Packaging is done by volunteers and is an unrewarding task as far as
public recognition is concerned. So I can understand why other Debian
maintainers may have reacted strongly when faced with such opinionated
generalizations as the ones you have kept making, TBH.

> >> > [2]
> >> > http://manpages.ubuntu.com/manpages/precise/man1/backportpackage.1.html
> >>
> >> Sorry. It is about 16.10, not 16.04. The point is to get 3.1.x
> >> releases on Yakkety.
> >> https://bugs.launchpad.net/ubuntu/+source/spyder/+bug/1672159
> >
> > You could do both with backportpackage. There is an option to
> > specify the target distribution of the package. You may try both
> > Xenial and Yakkety, although the value of providing backports for
> > Yakkety is arguable considering the short support cycle of interim
> > Ubuntu releases.
>
> Can you expand on what you mean by value? I am running Ubuntu 16.10,
> and I need the latest released stable Spyder, which is 3.1.4, so what
> is arguable here?

You'd be maintaining a PPA until 17.04 is out (not so far away), and
the latter will at least receive spyder 3.1.3, if not 3.1.4 if the
release is out before the next Ubuntu freeze. Not sure whether this is
worth the trouble, at least for interim (i.e. non-LTS) releases of
Ubuntu.

> If you're concerned about maintenance, then I am not interested in
> maintaining those manual backports either.

Based on your personal views about packaging, I am telling you: you
should consider maintaining your instance of spyder in a venv and
doing a regular pip install --upgrade when a new release is announced.
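That venv routine can be as simple as the following sketch; the environment location is an arbitrary choice:

```shell
# One-time setup: an isolated environment for spyder.
python3 -m venv ~/.venvs/spyder
~/.venvs/spyder/bin/pip install spyder

# Whenever a new release is announced:
~/.venvs/spyder/bin/pip install --upgrade spyder

# Launch the IDE from the venv.
~/.venvs/spyder/bin/spyder
```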
You would be wasting your time with a PPA and growing more frustrated
about package maintenance than you already are.

> If automation is set up correctly, maintenance doesn't matter, but
> the problem is that backportpackage cannot be run by the daily build
> servers.

Another bold claim. Although I'd agree that some of the package
maintenance tasks can be automated (such as tracking new upstream
releases), the rest often requires minimal but manual intervention
(such as updating the list of dependencies, packaging the new ones
that are missing, recording new copyright information...).

Once again, since I suspect you do not care about any of this, a
position I would totally respect, do yourself a favour and use a
pip+venv or a conda-based deployment solution.

--
You received this message because you are subscribed to the Google
Groups "spyder" group.
Visit this group at https://groups.google.com/group/spyderlib.
