On Oct 19, 2007, at 5:12 PM, Chris Devers wrote:

On Fri, 19 Oct 2007, [EMAIL PROTECTED] wrote:

On Oct 19, 2007, at 2:51 AM, Chris Devers wrote:

On Fri, 19 Oct 2007, [EMAIL PROTECTED] wrote:

I can draw a picture for you: http://finkproject.org/

In which case, your real argument appears to be "the Fink people don't
seem to be doing what I need fast enough."

In which case, the response is "you should contribute to Fink then".

Duly noted. I would like to try to write a unifier: a front end that searches the various porting systems (Fink, MacPorts, darwinports.com) and fetches the latest version of a package.
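To make the idea concrete, here is a minimal sketch of the version-lookup step in Perl. The CLI flags ("port info --version", "fink list") and the output formats being parsed are assumptions about those tools' interfaces, not verified ones:

    #!/usr/bin/perl
    # Hypothetical "unifier" sketch: ask each porting system which
    # version of a package it carries, and report what we find.
    use strict;
    use warnings;

    my $pkg = shift or die "usage: $0 <package>\n";
    my %versions;

    # MacPorts: assumes "port info --version" emits the version field
    chomp( my $port = `port info --version $pkg 2>/dev/null` );
    $versions{MacPorts} = $1 if $port =~ /(\d[\w.\-]*)/;

    # Fink: assumes "fink list" emits status/name/version columns
    my $fink = `fink list $pkg 2>/dev/null`;
    $versions{Fink} = $1 if $fink =~ /\b\Q$pkg\E\s+(\d[\w.\-]*)/;

    if (%versions) {
        print "$_: $versions{$_}\n" for sort keys %versions;
    }
    else {
        print "$pkg not found in any porting system checked\n";
    }

A real unifier would also need to normalize the different version schemes before deciding which package is actually "latest".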

[...] I, as a developer, should maintain the latest version of perl on
my machines. I give in!

Yes, if that's really what you need. I still think it isn't the end of
the world to just work with the bundled version of Perl (along, of
course, with whatever CPAN modules you need). It's not like 5.8.6 or
5.8.8 are such awful, archaic versions to work with in the first place.
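For what it's worth, pulling a module onto the stock perl is a one-liner; DBI below is just a placeholder for whatever module you need:

    perl -MCPAN -e 'install DBI'

The first run walks you through CPAN.pm's configuration; after that it's automatic.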

So target the release version, or do like everyone else that's
concerned about this and install your own Perl. It's not hard to do,
and it's really not that different than how things are on Debian.
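And "install your own Perl" really is only a few commands -- assuming a stock 5.8.8 source tarball and a prefix you own:

    tar xzf perl-5.8.8.tar.gz && cd perl-5.8.8
    sh Configure -des -Dprefix=$HOME/perl5
    make && make test && make install
    # then put $HOME/perl5/bin ahead of /usr/bin in your PATH

The -des flags take all of Configure's defaults, so the prefix is the only decision you have to make.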

Yes, it is different. Debian's packages are updated constantly, not just in point releases, so if there is a problem a new package is made available relatively quickly.

Maybe my Debian experience is too limited then, but this seems like a
slightly glossed over version of things to me.

The last time I spent a lot of time with Debian (roughly 2003-2005), it
was still on 3.0/Woody. Yes, there was a constant stream of package
updates, but IIRC they were all security patches, critical bugfixes
(with a *really* conservative definition of "critical" -- merely
braindead usability brokenness never seemed to be worth patching), etc.
It seems like most of the updates we were getting came via backports.org
rather than official updates to Woody itself.

Maybe things have evolved since then, but at the time it seemed like if an update wasn't for security or a real showstopping bug (e.g. keeps the
machine from booting, or a critical daemon from running), then it was
seen as a "mere features update" and got deferred until 3.1/Sarge. If
you wanted those "features" updates, you had to get them from backports
or roll your own. Maybe as a backlash, I seem to remember that this is
around when Ubuntu et al branched off to be a more current platform.

Things have changed significantly. As an example, we have a tool in the debian-perl group that compares our version of a Perl module with the module on CPAN. This is automated and runs daily (http://pkg-perl.alioth.debian.org/qa/versions.html). This way we can see which modules need updating and do the updates as part of our normal team work, keeping Perl fresh in Debian.
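The heart of that check is just a version comparison. A minimal sketch of that step in Perl -- the real tool walks the whole CPAN index, whereas here the module name and CPAN version are passed in by hand:

    #!/usr/bin/perl
    # Sketch of the packaged-vs-CPAN comparison step. The real QA tool
    # fetches CPAN's index itself; here the CPAN version is an argument.
    use strict;
    use warnings;
    use version;

    my ( $module, $cpan_version ) = @ARGV;
    die "usage: $0 <Module::Name> <cpan-version>\n"
        unless defined $module and defined $cpan_version;

    # Load the locally installed module and read its $VERSION
    eval "require $module; 1" or die "cannot load $module: $@";
    my $local = $module->VERSION;

    # version.pm overloads comparison operators for version objects
    if ( version->new($local) < version->new($cpan_version) ) {
        print "$module: packaged $local is behind CPAN $cpan_version\n";
    }
    else {
        print "$module: packaged $local is current\n";
    }

Run as e.g. "perl cpan-check.pl List::Util 1.19" (script name hypothetical).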

This seems like exactly the stance that we're talking about here, and as
frustrating as it can seem, there are really good reasons to do things
this way, not least being stability & predictability for developers, who
can assume confidently that release X is going to have Perl v.Y, etc.

As far as Ubuntu is concerned, they just take a snapshot of Debian, work out the bugs, freeze the code, and release it on a planned release date. Since it is a frozen version of Debian, Ubuntu quickly becomes outdated in comparison with Debian unstable, though they do issue updates for security and other bugs, which they get from Debian or initiate themselves.

Stability is good, but elusive. Is a patched version less stable than an unpatched one? Most new versions of software are bug fixes of the same code that has been working anyway -- maybe I shouldn't say "most", but we can agree it is many.

        Jeremiah
