On Sat, May 11, 2024 at 04:09:23PM +0200, Simon Josefsson wrote:
>    The current approach of running autoreconf -fi is based on a
>    misunderstanding: autoreconf -fi is documented to not replace certain
>    files with newer versions:
>    https://lists.nongnu.org/archive/html/bug-gnulib/2024-04/msg00052.html

And the root cause of *this* is because historically, people put their
own custom autoconf macros in aclocal.m4, so if autoreconf -fi
overwrote aclocal.m4, things could break.  This also means that
programmatically always doing "rm -f aclocal.m4 ; aclocal --install"
will break some packages.

The best solution to this is to encourage people to put the autoconf
macros that they maintain by hand --- the ones that aclocal can't
supply --- into acinclude.m4, which is now included by default in
addition to aclocal.m4.  Personally, I think the two names are
confusing and, if it weren't for historical reasons, perhaps should
have been swapped, but oh, well....

(For example, I have some custom local autoconf macros needed to
support MacOS in e2fsprogs's acinclude.m4.)
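
As a purely illustrative sketch (the macro and header names below are
made up, not the actual e2fsprogs ones), the idea looks like this:

    $ cat acinclude.m4
    dnl Hypothetical hand-maintained macro; kept in acinclude.m4 so
    dnl that regenerating aclocal.m4 doesn't lose it.
    AC_DEFUN([MY_CHECK_FOO],
      [AC_CHECK_HEADERS([foo.h])])

    $ grep MY_CHECK_FOO configure.ac
    MY_CHECK_FOO

    $ autoreconf -fi
    # aclocal folds acinclude.m4 into the regenerated aclocal.m4, so
    # the custom macro survives even if aclocal.m4 is blown away.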

> 1) Use upstream's PGP signed git-archive tarball.

Here's how I do it in e2fsprogs which (a) makes the git-archive
tarball be bit-for-bit reproducible given a particular git commit ID,
and (b) minimizes the size of the tarball when stored using
pristine-tar:

https://github.com/tytso/e2fsprogs/blob/master/util/gen-git-tarball
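
The core of the technique (this is just a sketch with an illustrative
version number, not the actual script) is that "git archive" output is
deterministic for a given commit, and gzip just has to be told not to
embed a timestamp:

    commit=v1.47.0    # illustrative tag; any commit ID works
    git archive --format=tar --prefix=e2fsprogs-1.47.0/ "$commit" \
        | gzip -9n > e2fsprogs-1.47.0.tar.gz
    # gzip -n omits the name and timestamp from the gzip header, so
    # the result is bit-for-bit reproducible given the commit ID.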

> To reach our goals in the beginning of this post, this upstream tarball
> has to be filtered to remove all pre-generated artifacts and vendored
> code.  Use some mechanism, like the debian/copyright Files-Excluded
> mechanism to remove them.  If you used a git-archive upstream tarball,
> chances are higher that you won't have to do a lot of work especially
> for pre-generated scripts.

Why does it *have* to be filtered?  For the purposes of building, if
you really want to nuke all of the pre-generated files, you can just
move them out of the way at the beginning of the debian/rules run, and
then move them back as part of "debian/rules clean".  Then you can use
autoreconf -fi to your heart's content in debian/rules (modulo
possibly breaking things if you insist on nuking aclocal.m4 and
regenerating it without taking proper care, as discussed above).
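
For instance, a rough debian/rules sketch (assuming a recent debhelper
with the dh(1) sequencer and its execute_before_/execute_after_ hook
targets; the stash directory name is made up):

    # Stash the upstream pre-generated files before autoreconf runs...
    execute_before_dh_autoreconf:
    	mkdir -p debian/.stashed-autogen
    	mv -f configure aclocal.m4 debian/.stashed-autogen/ || true

    # ...and put them back when cleaning, so the unpacked tree matches
    # the upstream tarball again.
    execute_after_dh_clean:
    	if [ -d debian/.stashed-autogen ]; then \
    		mv -f debian/.stashed-autogen/* . ; \
    		rmdir debian/.stashed-autogen ; \
    	fi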

This also allows the *.orig.tar.gz to be the same as the upstream
signed PGP tarball, which you've said is the ideal, no?

> There is one design of gnulib that is important to understand: gnulib is
> a source-only library and is not versioned and has no release tarballs.
> Its release artifact is the git repository containing all the commits.
> Packages like coreutils, gzip, tar etc pin to one particular commit of
> gnulib.

Note that we treat gnulib a bit differently from how we treat other C
shared libraries, where we claim that *all* libraries must be
dynamically linked, and that including source code by reference is
against Debian Policy, precisely because of the toil needed to update
all of the binary packages should some security vulnerability get
discovered in a library which is either linked statically or
included by code duplication.

And yet, we seem to have given gnulib a pass, probably because it
would be too awkward to enforce that rule *everywhere*, so apparently
we've turned a blind eye.

I personally think the "everything must be dynamically linked" rule
is not really workable in real life and should be treated as an
aspirational goal --- and the fact that we treat gnulib differently
is a great proof point that the current Debian policy is not really
doable if it were enforced strictly, everywhere, with no
exceptions....

Certainly for languages like Rust, it *can't* be enforced, so again,
that's another place where that rule is not enforced consistently; if
it were, we wouldn't be able to ship Rust programs.

                                                    - Ted
