On 2018-11-11, 3:44 PM, "Edwardo Garcia" <wdgar...@gmail.com> wrote:
On Fri, Nov 9, 2018 at 6:05 PM Barry Pollard <barry_poll...@hotmail.com> wrote:
2/ it gives the impression of immature and buggy software - this turns thoughts
towards alternatives. IRC shows many admins have no loyalty to much of today's
software (well, Windows fanbois excepted).
Massively disagree. Frequent releases, to me, give the impression of an actively
maintained and evolving project. And there are a lot of changes in the HTTP
space (HTTP/2, move to encryption, increased awareness on security...etc.).
That's contrary to how most see it. I held off replying to this thread because
I wanted to put this out there, on our system admin group, which contains some
491 members from 29 countries and includes some of the rather well known ISP
and hosting providers, ones a lot bigger than a few SOHO admins; just one of
them alone hosts over 1 million websites. As of midnight, 143 had replied, with
141 saying they do consider a release-often approach bad and have refused, and
will refuse, to update unless a compelling in-the-wild issue creates a reason
to. The other 2 respondents said they don't care either way.
The Dovecot mentioned in an earlier post is a classic example: for over 10
years we have been annoyed at its releases, mostly because a rushed bugfix
breaks something else and the cycle continues. We have members still using
versions several years old because they consider them more stable. It has long
been a pet peeve.
So I would err on the side of caution before this release-often approach creeps in.
You only have to look at the past few attempts (scrapped versions) to release
Apache to see the dangers in a rush-rush-rush attitude.
I think it’s dangerous to conflate speed with quality…these are two very
different things, and I believe they should be evaluated completely separately.
It’s possible to have frequent releases with good or bad quality, and it’s
possible to have quality releases on a rare or frequent schedule. I also think
we’ve all seen (or can easily find) open source projects that match any
combination of these two measures. So I don’t think it’s valid to conclude
that one will necessarily lead to the other.
An important principle of software development is that smaller changes are
easier to understand, which means impacts are easier to predict. This is true
even for a project as big and with as varied usage as httpd (although note that
“easier” != “easy”), and more frequent releases will, generally speaking,
contain fewer and smaller changes. When releases are months or years apart,
it’s too easy to start throwing in everything *and* the kitchen sink…but this
is more likely to lead to problems because you end up with bigger changes in
every release.
And note that “frequent” doesn’t have to mean “rush”…the opposite can in fact
be true. If a developer knows that there are many opportunities to deliver a
feature, they can feel less pressure to rush something into a release when it
isn’t ready…but if releases are months or years apart, that’s when developers
might rush things a little to get on the release train.
I think it’s important to look at each development community and their track
record, rather than making broad generalizations. A community that values both
schedule and quality will make smart choices like dropping a feature from a
release if it’s not ready for prime time. I’m a relative newcomer to this
list, but I’ve seen a few release cycles now, and I don’t think anyone is
trying to put out a rushed and incomplete product. And the “scrapped versions”
used above as an example of rushing are actually great examples of the
validation process doing its job and preventing delivery of releases that were
not ready…I think these are evidence of the community indeed erring on the side
of caution and not sacrificing quality to meet a schedule.
Just my $0.02. :-)