I admire the recent addition to the manual on bulk package
updates. [1] The helper script for generating per-package commits
looks like it will be extremely useful.
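(If I understand that section correctly, the workflow is roughly the
following - the updater type below is just a placeholder on my part,
and I'm assuming the helper script is the etc/committer.scm in the
tree, so please correct me if I've misread it:

    # Update every package handled by one updater, then let the helper
    # turn the modified definitions into one commit per package.
    $ ./pre-inst-env guix refresh --type=opam -u
    $ ./etc/committer.scm
)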
However, I'm wondering about the consequences of bulk updates in
this style (one commit per package) for the common case where
packages have tightly coupled version dependencies. That is,
sometimes builds will fail if a set of N packages isn't updated
atomically. (I'm confronting such a case right now in the OCaml
ecosystem.)
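To make that concrete with invented package names: if ocaml-foo 2.0
changes its interface and ocaml-bar 2.0 is the matching release, then
a one-commit-per-package series like

    gnu: ocaml-foo: Update to 2.0.
    gnu: ocaml-bar: Update to 2.0.

leaves the tree broken at the first commit, since ocaml-bar is still
at 1.x there and no longer builds against the new ocaml-foo.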
Really this is a question about the trade-off between keeping
changes small and focused and keeping the repository (or at least
the main branch) buildable at every commit.
I think the first of these two principles is pretty
uncontroversial, and the Guix manual codifies it as "Verify that
your patch contains only one set of related changes."
As for keeping the repo buildable [2], I couldn't find any
explicit mention of that in the manual. It does seem like
buildability is pretty good in practice - with Guix releases
historically being few and far between [3], it's nice that I can
run `git pull` at any time followed by `guix pull` and generally
have something that works. And of course, consistent buildability
also helps CI and other repository analysis tools (e.g., `git
bisect`) work more efficiently.
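(As a concrete example - the package name here is just a
placeholder - something like

    $ git bisect start <bad-commit> <good-commit>
    $ git bisect run sh -c './pre-inst-env guix build ocaml-foo'

only gives trustworthy answers when the intermediate commits
themselves build.)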
So my questions are: does the community put much priority on
keeping the repository buildable at each commit (versus, say, at
the granularity of PRs)? And if it does, am I right to detect that
some exceptions are made for bulk package updates?
Cheers,
Jason
[1]
https://guix.gnu.org/manual/devel/en/html_node/Bulk-Updates.html
[2] "Buildable" could be considered a matter of degree - e.g.,
Guile modules might compile even when the packages themselves
don't, and packages might build on some architectures but not
others. For the purposes of this discussion I'm lumping all of
these together.
[3] I'm happy to see this improving with GCD 005!