Re: Debian NMU Sprint Thursday, June 6th 17:00 UTC!

2024-05-28 Thread Vagrant Cascadian
On 2024-05-22, Holger Levsen wrote:
> On Tue, May 21, 2024 at 01:21:10PM -0700, Vagrant Cascadian wrote:
>> I am hoping to schedule some Non-Maintainer Uploads (NMU) sprints,
>> starting with two thursdays from now...
>
> yay!
>
> though I won't be able to join on June 6th. what's the other Thursday
> you have in mind?

I had not planned any further than June 6th, but maybe roughly once or
twice a month? Or we could go all out and do them every week for a
while... Let us see what we think after the one on the 6th.


>> Unapplied patches:
>>   
>> https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=1
>
> personally I'll go with this with two modifications:
> - trixie instead of sid (that's 164 packages currently instead of 184 and
>   it saves me from investing time in packages which are not fit for testing.)

UDD makes it pretty easy to customize a query to your particular
approach. I tried to keep the URL short in the email. :)

To me, "trixie and sid" seems like the best choice, as anything that is
in trixie but is not in sid ... is likely obsolete. Although UDD seems
unwilling to actually return results for "trixie and sid" at the
moment...

"trixie" only tends to display bugs that are already fixed in unstable
(or sometimes experimental)... although those should show up with a
strikethrough.


> - sorted by popcon (though I fully agree oldest first is also a very value
>   metric)

*shrug* I just used the default sort order to keep the URL
manageable. :)


Another one that might be useful:

  exclude build path, using dist trixie:

  
https://udd.debian.org/bugs/?release=trixie=only=ign=ign=ign=ign=buildpath=reproducible-builds%40lists.alioth.debian.org=1

Given that tests.reproducible-builds.org is no longer testing build paths...

Though that might ignore some bugs that have other tags commingled with
buildpath tags. But it also gives a much shorter list to poke at. :)


live well,
  vagrant


signature.asc
Description: PGP signature
___
Reproducible-builds mailing list
Reproducible-builds@alioth-lists.debian.net
https://alioth-lists.debian.net/cgi-bin/mailman/listinfo/reproducible-builds


Debian NMU Sprint Thursday, June 6th 17:00 UTC!

2024-05-21 Thread Vagrant Cascadian
I am hoping to schedule some Non-Maintainer Upload (NMU) sprints,
starting two Thursdays from now...

Planning on meeting on irc.oftc.net in the #debian-reproducible channel
at 17:00 UTC and going for an hour or two or three. Feel free to start
early or stay late, or even fix things on some other day!

We will have one or more Debian Developers available to sponsor uploads,
so even if you can't upload yourself but you know how to build a Debian
package, please join us!


Unapplied patches:

  
https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=1

This list is sorted by the oldest bugs with patches not marked pending,
so we can target bugs that have just stalled out for whatever reason,
but feel free to pick bugs that scratch your particular itch.

We will want to make sure the patch still applies and/or refresh the
patches, make sure it still solves the issue, and update the bug report
where appropriate.


Documentation about performing NMUs:

  https://www.debian.org/doc/manuals/developers-reference/pkgs.html#nmu

We will be uploading to the DELAYED queue (presumably with a delay of
between 10 and 15 days).


If the package has been orphaned we can generally upload without delay
(check the https://tracker.debian.org/PACKAGE page, which usually lists
this) and mark it as maintained by "Debian QA Group
<packages@qa.debian.org>" if needed.

If you are impatient, try fixing QA packages, as you can upload fixes
without delays:

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_maint_debian-qa.html


Let's fix some bugs!


live well,
  vagrant




Re: reproduceiable issue on -dbgsym package

2024-05-06 Thread Vagrant Cascadian
On 2024-05-06, Bo YU wrote:
> I have one package aemu[0] and its reprotest[1] test failed. However
> from the log online, I did not find any value cluster to fix the
> issue. So I use `reprotest` locally:

My guess would be build paths.

You could use reprotest --auto-build to systematically search for which
type of issue it is (e.g. first test only timestamps, then test only
build path, etc.). At the end it reports which variations triggered the
reproducibility issue.

If it is build paths, since it is a cmake project, you might try setting
this from debian/rules:

  -DCMAKE_BUILD_RPATH_USE_ORIGIN=ON

This documents the issue a bit more:

  
https://tests.reproducible-builds.org/debian/issues/unstable/cmake_rpath_contains_build_path_issue.html
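As a sketch, in a dh-style debian/rules that could look something like
the following (this assumes the package builds with dh and the cmake
buildsystem; the override name is the usual dh convention, not something
from the aemu package itself):

```make
# Hypothetical debian/rules fragment (dh-style, cmake buildsystem).
# CMAKE_BUILD_RPATH_USE_ORIGIN=ON makes build-tree RPATHs use $ORIGIN
# instead of the absolute build path, avoiding build-path differences.
override_dh_auto_configure:
	dh_auto_configure -- -DCMAKE_BUILD_RPATH_USE_ORIGIN=ON
```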


live well,
  vagrant




Bug#1068890: diffoscope: --hard-timeout option

2024-04-16 Thread Vagrant Cascadian
On 2024-04-16, Chris Lamb wrote:
> However, I think this first iteration of --hard-timeout time has a few
> things that would need ironing out first, and potentially make it not
> worth implementing:
>
> (1) You suggest it should start again with "--max-container-depth 3",
> but it would surely need some syntax (or another option?) to control
> that "3" (but for the second time only).

What about going the other direction ... starting with a very small
value for max-container-depth and incrementally increasing it,
generating a report (or at least storing sufficient data to generate
one) between each increment? That way you always get some information,
but essentially increase the resolution incrementally.

Or would that approach just be too inefficient?


> (2) In fact, its easy to imagine that one would want to restart with
> other restrictions as well: not just --max-container-depth. For
> instance, excluding external commands like readelf and objdump that
> you know to be slow.

Ah, yes, knowing the common time sinks would be tremendously helpful!


live well,
  vagrant




Bug#1038845: reprotest: transition from /etc/timezone to /etc/localtime

2024-04-12 Thread Vagrant Cascadian
Control: block 1038845 by 1001250

On 2023-06-21, bl...@debian.org wrote:
> reprotest is currently referencing /etc/timezone without support for
> /etc/localtime. /etc/timezone is a legacy interface that is Debian
> specific. The cross-distro standard /etc/localtime (as a symlink to
> the appropriate timezone file), so please switch your package to
> /etc/localtime. tzsetup will stop creating /etc/timezone soon. Note
> that the list of affected source packages was compiled with
> codesearch, so false positives are possible. Thank you. 

This reference is only in the code that runs inside a qemu virtual
machine, and that code path is currently broken anyway, so it needs to
be fixed somehow to remove /etc/timezone.

live well,
  vagrant




Bug#1068853: reprotest: SyntaxWarning: invalid escape sequence '\;'

2024-04-12 Thread Vagrant Cascadian
On 2024-04-12, Fay Stegerman wrote:
> * Vagrant Cascadian  [2024-04-12 19:29]:
>> On 2024-04-12, Holger Levsen wrote:
>> > when installing reprotest 0.7.27:
>> >
>> > SyntaxWarning: invalid escape sequence '\;'
>> > Setting up reprotest (0.7.27) ...
>> > /usr/lib/python3/dist-packages/reprotest/__init__.py:360: SyntaxWarning: 
>> > invalid escape sequence '\;'
>> >   run_or_tee(['sh', '-ec', 'find %s -type f -exec sha256sum "{}" \;' % 
>> > self.artifact_pattern],
> [...]
>> How exactly did you get this error?
>> 
>> I installed locally, but did not encounter any such issues on package
>> installation just now, and also nothing when manually running a simple
>> test:
>> 
>>   reprotest 'date > date' date
>> WARNING:reprotest:The control build runs on 1 CPU by default, give 
>> --min-cpus to increase this.
>> WARNING:reprotest.build:IGNORING user_group variation; supply more 
>> usergroups with --variations=user_group.available+=USER1:GROUP1;USER2:GROUP2 
>> or alternatively, suppress this warning with --variations=-user_group
>> WARNING:reprotest.build:Not using sudo for domain_host; your build may fail. 
>> See man page for other options.
>> WARNING:reprotest.build:Be sure to `echo 1 > 
>> /proc/sys/kernel/unprivileged_userns_clone` if on a Debian system.
>> --- /tmp/tmp4vqq6736/control
>> +++ /tmp/tmp4vqq6736/experiment-1
>> │   --- /tmp/tmp4vqq6736/control/source-root
>> ├── +++ /tmp/tmp4vqq6736/experiment-1/source-root
>> │ │   --- /tmp/tmp4vqq6736/control/source-root/date
>> │ ├── +++ /tmp/tmp4vqq6736/experiment-1/source-root/date
>> │ │ @@ -1 +1 @@
>> │ │ +L 13 apr   2024 07:27:01 GMT
>> │ │ -Fri Apr 12 05:27:01 GMT 2024
>
> That syntax warning is new in Python 3.12.  And it's correct, one should use 
> raw
> strings (r'...') or two backslashes for escape sequences intended for e.g.
> regexes or shell commands like here, not Python itself.

Ok, I was finally able to reproduce this by installing python3.12 in
the environment; it is not yet the default python, nor installed by
default, but obviously will be before too long...
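For what it's worth, the fix Fay describes is mechanical: a raw string
(or a doubled backslash) yields exactly the same runtime value, just
without the compile-time warning. Using the find command quoted above
as the example:

```python
# '\;' in a plain string literal is not a recognized Python escape
# sequence; Python 3.12 warns about it at compile time. A raw string
# (or an explicit doubled backslash) produces the identical string
# without any warning.
escaped = 'find %s -type f -exec sha256sum "{}" \\;'  # doubled backslash
raw = r'find %s -type f -exec sha256sum "{}" \;'      # raw string
assert escaped == raw
print(raw % "/tmp/artifacts")
```

Either spelling works; the raw string is usually the more readable fix
for strings destined for the shell or for regexes.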

That at least gives me enough to poke at this going forward!

Thanks!

live well,
  vagrant




Bug#1068853: reprotest: SyntaxWarning: invalid escape sequence '\;'

2024-04-12 Thread Vagrant Cascadian
On 2024-04-12, Holger Levsen wrote:
> when installing reprotest 0.7.27:
>
> SyntaxWarning: invalid escape sequence '\;'
> Setting up reprotest (0.7.27) ...
> /usr/lib/python3/dist-packages/reprotest/__init__.py:360: SyntaxWarning: 
> invalid escape sequence '\;'
>   run_or_tee(['sh', '-ec', 'find %s -type f -exec sha256sum "{}" \;' % 
> self.artifact_pattern],
> /usr/lib/python3/dist-packages/reprotest/build.py:315: SyntaxWarning: invalid 
> escape sequence '\$'
>   _ = _.append_setup_exec_raw('DROP_ARCH="-v -e ^$(uname -m)\$"')
> /usr/lib/python3/dist-packages/reprotest/build.py:317: SyntaxWarning: invalid 
> escape sequence '\$'
>   _ = _.append_setup_exec_raw('if [ $WORDSIZE -eq 64 ]; then \
> /usr/lib/python3/dist-packages/reprotest/environ.py:10: SyntaxWarning: 
> invalid escape sequence '\w'
>   "path": "(/\w{1,12}){1,4}",
> /usr/lib/python3/dist-packages/reprotest/environ.py:11: SyntaxWarning: 
> invalid escape sequence '\d'
>   "port": "([1-9]\d{0,3}|[1-5]\d{4})",
> /usr/lib/python3/dist-packages/reprotest/environ.py:12: SyntaxWarning: 
> invalid escape sequence '\w'
>   "domain": "\w{1,10}(\.\w{1,10}){0,3}",
> /usr/lib/python3/dist-packages/reprotest/environ.py:13: SyntaxWarning: 
> invalid escape sequence '\w'
>   "password": "\w{1,40}",
> /usr/lib/python3/dist-packages/reprotest/environ.py:14: SyntaxWarning: 
> invalid escape sequence '\w'
>   "username": "\w{2,20}",
> /usr/lib/python3/dist-packages/reprotest/environ.py:113: SyntaxWarning: 
> invalid escape sequence '\w'
>   "REPROTEST_CAPTURE_ENVIRONMENT_UNKNOWN_\w+"]
> /usr/lib/python3/dist-packages/reprotest/lib/adt_testbed.py:305: 
> SyntaxWarning: invalid escape sequence '\['
>   script = '''sed -rn 's/^(deb|deb-src) +(\[.*\] *)?([^ 
> ]*(ubuntu.com|debian.org|ftpmaster|file:\/\/\/tmp\/testarchive)[^ ]*) +([^ 
> -]+) +(.*)$/\\1 \\2\\3 \\5-%s \\6/p' /etc/apt/sources.list `ls 
> /etc/apt/sources.list.d/*.list 2>/dev/null|| true` > 
> /etc/apt/sources.list.d/%s.list; for retry in 1 2 3; do apt-get 
> --no-list-cleanup -o Dir::Etc::sourcelist=/etc/apt/sources.list.d/%s.list -o 
> Dir::Etc::sourceparts=/dev/null update 2>&1 && break || sleep 15; done''' % 
> (pocket, pocket, pocket)
> /usr/lib/python3/dist-packages/reprotest/lib/adt_testbed.py:320: 
> SyntaxWarning: invalid escape sequence '\/'
>   'for d in %s; do [ ! -d $d ] || touch -r $d %s/${d//\//_}.stamp; done' % (
> /usr/lib/python3/dist-packages/reprotest/lib/adt_testbed.py:342: 
> SyntaxWarning: invalid escape sequence '\/'
>   'for d in %s; do s=%s/${d//\//_}.stamp;'
> /usr/lib/python3/dist-packages/reprotest/lib/adt_testbed.py:724: 
> SyntaxWarning: invalid escape sequence '\('
>   script = '''d=%(t)s/deps
> /usr/lib/python3/dist-packages/reprotest/lib/adt_testbed.py:1211: 
> SyntaxWarning: invalid escape sequence '\/'
>   script += '''REL=$(sed -rn '/^(deb|deb-src) 
> .*(ubuntu.com|debian.org|ftpmaster|file:\/\/\/tmp\/testarchive)/ { s/^[^ ]+ 
> +(\[.*\] *)?[^ ]* +([^ -]+) +.*$/\\2/p}' $SRCS | head -n1); '''

How exactly did you get this error?

I installed locally, but did not encounter any such issues on package
installation just now, and also nothing when manually running a simple
test:

  reprotest 'date > date' date
WARNING:reprotest:The control build runs on 1 CPU by default, give --min-cpus 
to increase this.
WARNING:reprotest.build:IGNORING user_group variation; supply more usergroups 
with --variations=user_group.available+=USER1:GROUP1;USER2:GROUP2 or 
alternatively, suppress this warning with --variations=-user_group
WARNING:reprotest.build:Not using sudo for domain_host; your build may fail. 
See man page for other options.
WARNING:reprotest.build:Be sure to `echo 1 > 
/proc/sys/kernel/unprivileged_userns_clone` if on a Debian system.
--- /tmp/tmp4vqq6736/control
+++ /tmp/tmp4vqq6736/experiment-1
│   --- /tmp/tmp4vqq6736/control/source-root
├── +++ /tmp/tmp4vqq6736/experiment-1/source-root
│ │   --- /tmp/tmp4vqq6736/control/source-root/date
│ ├── +++ /tmp/tmp4vqq6736/experiment-1/source-root/date
│ │ @@ -1 +1 @@
│ │ +L 13 apr   2024 07:27:01 GMT
│ │ -Fri Apr 12 05:27:01 GMT 2024

live well,
  vagrant




Bug#1034311: reprotest: make it easier to compare against an existing build (eg from ftp.d.o)

2024-04-11 Thread Vagrant Cascadian
On 2024-03-08, Vagrant Cascadian wrote:
> On 2023-04-12, Holger Levsen wrote:
>>  i guess reprotest maybe should grow an option to do
>> --control-build /path/to/packages/ 
>> --vary=build_path=/use/this/build/path ... 
>>to make it easier to use reprotest to compare against an existing 
>> build
>>YES
>>  e.g. there is no way to disable buidl path variations when 
>> comparing
>> against an arbitrary build
>>i'm reporting this as a bug to the bts, quoting your words here. 
>> (ok?)
>>  reprotest can control it's own builds ... but if i want to use 
>> reprotest
>>against the archive packages or an sbuild 
>>or pbuilder build package ... it will always have a different 
>> build path
>
> Forgot about this bug, but I have since implemented a proof-of-concept
> of this:
>
>   
> https://salsa.debian.org/reproducible-builds/reprotest/-/commits/wip-specify-build-path?ref_type=heads

And merged in 0.7.27 ... which resolves the one specific issue mentioned
... are there any other must-haves before we can consider this bug closed?

Better documentation of how to actually do this? :)

live well,
  vagrant




Bug#847805: reprotest: document/support simple reproducibility test with sbuild

2024-04-11 Thread Vagrant Cascadian
On 2016-12-11, Sean Whitton wrote:
> On Sun, Dec 11, 2016 at 03:12:57PM -0700, Sean Whitton wrote:
>> I have sbuild properly set up on my machine, and I want to use it to
>> test package reproducibility.  Something like this, where PWD is an
>> unpacked source package:
>> 
>> 1) sbuild
>> 2) record .deb checksums from .changes file
>> 3) sbuild
>> 4) compare .deb checksums in new .changes file
>> 5) run diffoscope if the checksums differ
>
> Thanks to #debian-reproducible, this is mostly what I wanted:
>
> reprotest auto . -- schroot unstable-amd64-sbuild
>
> This doesn't actually invoke sbuild, but it does perform the builds
> inside the schroot I already have set up, and compare the results.
>
> This is useful, but it would also be good if reprotest could invoke
> sbuild(1) itself.  That is because sbuild has lots of useful options.
>
> For example, suppose that foo depends on dh_bar, and I am hacking on
> dh_bar in the hope of making foo reproducible.  Then I want to build foo
> against my local version of dh_bar.  With sbuild, I can do this using
> --extra-package and --add-depends.  reprotest with a pure schroot
> backend can't do that kind of thing, so far as I can tell.

A while back I did work on a simple wrapper for sbuild that calls
reprotest as part of a --finished-build-commands hook:

  https://salsa.debian.org/reproducible-builds/sbuild-unshare-reprotest

It is definitely quite rough around the edges and there are some caveats
that limit the functionality, but it can do some of what you were
looking for.


live well,
  vagrant




Re: Bug#882511: dpkg-buildpackage: should allow caller to force inclusion of source in buildinfo

2024-04-10 Thread Vagrant Cascadian
On 2024-04-09, Guillem Jover wrote:
> I've now finished the change I had in that branch, which implements
> support so that dpkg-buildpackage can be passed a .dsc or a source-dir,
> and in the former will first extract it, and for both then it will
> change directory to the source tree. If it got passed a .dsc then it
> will instruct dpkg-genbuildinfo to include a ref to it.
>
> Which I think accomplishes the requested behavior in a safe way? I've
> attached what I've got, which I'm planning on merging for 1.22.7. I'll
> probably split that into two commits though before merging.

Had a chance to take this for a test run, and it appears to work, though
with a few surprises...

  dpkg-buildpackage -- hello_2.10-3.dsc

Ends up regenerating the .dsc, as --build=any,all,source is the default
... which may end up with a different .dsc checksum in the .buildinfo
than the .dsc that was passed on the command line. Which makes some
sense, but maybe it would be better to error out? I would not expect
dpkg-buildpackage to regenerate the .dsc when it is passed a .dsc!


  dpkg-buildpackage --build=any,all -- /path/to/hello_2.10-3.dsc

Fails to find the .dsc file, as it appears to extract the sources to
hello-2.10 and then expects to find ../hello_2.10-3.dsc


All that said ... this seemed to work for me:

  dpkg-buildpackage --build=any,all -- hello_2.10-3.dsc

So yay, progress! Thanks!


None of the above cases clean up the hello-2.10 directory extracted from
the .dsc file, so re-running any of them requires manually cleaning it
up or running from a clean directory; otherwise you experience various
failure modes with the existing hello-2.10 directory.


So a few little glitches, but overall this seems close to something we
have really wanted for reproducible builds! And just for good measure,
thanks!


live well,
  vagrant




Bug#1034311: reprotest: make it easier to compare against an existing build (eg from ftp.d.o)

2024-03-08 Thread Vagrant Cascadian
On 2023-04-12, Holger Levsen wrote:
>  i guess reprotest maybe should grow an option to do
> --control-build /path/to/packages/ 
> --vary=build_path=/use/this/build/path ... 
>to make it easier to use reprotest to compare against an existing 
> build
>YES
>  e.g. there is no way to disable buidl path variations when 
> comparing
> against an arbitrary build
>i'm reporting this as a bug to the bts, quoting your words here. 
> (ok?)
>  reprotest can control it's own builds ... but if i want to use 
> reprotest
>against the archive packages or an sbuild 
>or pbuilder build package ... it will always have a different 
> build path

Forgot about this bug, but I have since implemented a proof-of-concept
of this:

  
https://salsa.debian.org/reproducible-builds/reprotest/-/commits/wip-specify-build-path?ref_type=heads

:)

live well,
  vagrant




Re: Verification Builds and Snapshots For Debian

2023-10-13 Thread Vagrant Cascadian
On 2023-10-12, Marek Marczykowski-Górecki wrote:
> On Sat, Sep 30, 2023 at 04:59:33PM -0700, Vagrant Cascadian wrote:
>> On 2023-09-20, Lucas Nussbaum wrote:
>> > On 19/09/23 at 13:52 -0700, Vagrant Cascadian wrote:
>> >> Snapshotting the archive(s) multiple times per day, today, tomorrow, and
>> >> going forward will at least enable doing verification rebuilds of
>> >> packages starting from this point, with less immediate overhead than
>> >> trying to replicate the entire functionality or more complete history of
>> >> snapshot.debian.org.
>> 
>> In the meantime, I worked on a naive implementation of this, using
>> debmirror and btrfs snapshots (zfs or xfs are other likely candidates
>> for filesystem-level snapshots). It is working better than I expected!
>
> Isn't this more or less what has been tried few times before, and it
> works only until you load it with years worth of data?

Well, then we partition it off by year, or whatever size unit works? It
is also possible that improvements in the underlying filesystem and disk
technologies over the years make it more viable now than in the past.

I definitely don't have more than a few months using this method at this
time, so sure, it could all come screeching to a halt at some
point...

I can continue to backfill from snapshot.debian.org until it breaks, I
suppose. :)

live well,
  vagrant


signature.asc
Description: PGP signature
___
Reproducible-builds mailing list
Reproducible-builds@alioth-lists.debian.net
https://alioth-lists.debian.net/cgi-bin/mailman/listinfo/reproducible-builds


Re: Verification Builds and Snapshots For Debian

2023-10-12 Thread Vagrant Cascadian
On 2023-10-12, Chris Lamb wrote:
>> In the meantime, I worked on a naive implementation of this, using
>> debmirror and btrfs snapshots (zfs or xfs are other likely candidates
>> for filesystem-level snapshots). It is working better than I expected!
> […]
>> Currently weighing in at about 550GB, each snapshot of the archive for
>> amd64+all+source is weighing in under 330GB if I recall correctly... so
>> that is over a month worth of snapshots for the cost of about two full
>> snapshots. Obviously, adding more architectures would dramatically
>> increase the space used (Would probably add arm64, armhf, i386, ppc64el
>> and riscv64 if I were to do this again).
>
> This sounds like great progress. :)  Do you have any updates since you
> posted your message?

It's still running! And now I have one instance running on an xfs
filesystem, and one on btrfs. Only the xfs one is publicly available
via:

  http://snapshot.reproducible-builds.org/snapshot-experiment

Which only started earlier this month, but in theory could pull in the
updates from the btrfs snapshots for a little more redundancy.

Also managed to backfill some older generations from
snapshot.debian.org, maybe as far back as July? Those are only available
on the currently not publicly accessible btrfs implementation.

Could probably set up some proxy to make the ones on btrfs available
publicly too.

> (Are you snapshotting after each dinstall and labelling them with some
> timestamp…? Or perhaps you have some other, cleverer, scheme?)

The timestamp I am using is the most recent timestamp from any relevant
Release file. This way, the timestamp for any given mirror state
(presuming you are mirroring the same distributions and architectures)
should match if you had two snapshots running independently.
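A minimal sketch of that timestamp selection (the function name and the
exact timestamp format here are my illustrative assumptions, not the
actual implementation):

```python
from email.utils import parsedate_to_datetime

def snapshot_timestamp(release_texts):
    """Pick the newest Date: field across the mirrored Release files."""
    dates = [
        parsedate_to_datetime(line[len("Date:"):].strip())
        for text in release_texts
        for line in text.splitlines()
        if line.startswith("Date:")
    ]
    return max(dates)

# Two suites mirrored in the same run; the newest Date: wins.
releases = [
    "Origin: Debian\nSuite: stable\nDate: Thu, 12 Oct 2023 08:14:05 UTC\n",
    "Origin: Debian\nSuite: unstable\nDate: Thu, 12 Oct 2023 10:02:41 UTC\n",
]
print(snapshot_timestamp(releases).strftime("%Y%m%dT%H%M%SZ"))
```

Since the Date: fields come from the archive itself rather than the
local clock, two independent mirrors of the same state derive the same
name.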

For the main repositories (e.g. not security or incoming), I am syncing
from a leaf mirror that happens to be very close on the network, so I
just schedule it to run from cron roughly when I expect the leaf mirror
to have finished updating, and then do a second pass some hours later
just in case, so we are less likely to miss a snapshot generation or get
an incomplete one. I really want to avoid missing or partial snapshots;
that could certainly use some more solid error checking, as it mostly
relies on debmirror doing the right thing.

It is also currently missing debian-installer images, though I *think*
that would be reasonably easy to add by passing more arguments to
debmirror. For the first proof-of-concept I focused on .deb and .udeb
packages, in order to be able to rebuild packages.


live well,
  vagrant



Re: Verification Builds and Snapshots For Debian

2023-10-06 Thread Vagrant Cascadian
On 2023-09-30, Vagrant Cascadian wrote:
> On 2023-09-20, Lucas Nussbaum wrote:
>> On 19/09/23 at 13:52 -0700, Vagrant Cascadian wrote:
>>> * Looking forward and backwards at snapshots
>>> 
>>> I do think that a more complete snapshot approach is probably better
>>> than package-specific snapshots, and it might be worth doing
>>> forward-looking snapshots of ftp.debian.org (and security.debian.org and
>>> incoming.debian.org), in addition to trying to fill out all the missing
>>> past snapshots to be able to attempt verification builds of older
>>> packages, such as all of bookworm.
>>> 
>>> Snapshotting the archive(s) multiple times per day, today, tomorrow, and
>>> going forward will at least enable doing verification rebuilds of
>>> packages starting from this point, with less immediate overhead than
>>> trying to replicate the entire functionality or more complete history of
>>> snapshot.debian.org.
>
> In the meantime, I worked on a naive implementation of this, using
> debmirror and btrfs snapshots (zfs or xfs are other likely candidates
> for filesystem-level snapshots). It is working better than I expected!

xfs seems to work well enough too, by using "cp --archive --reflink" to
produce snapshots:

  http://snapshot.reproducible-builds.org/snapshot-experiment/archive/debian/

A little klunkier than btrfs snapshots, but workable.

It would also be possible with btrfs or xfs (and presumably zfs somehow)
to make by-checksum reflinked copies, which might make it possible to
(at least partially) reassemble a repository from the contents if you
have the relevant Release, Packages and Sources files...
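A rough sketch of what such a by-checksum pool could look like (all
names here are hypothetical, and a real implementation would reflink
with `cp --reflink=always` on btrfs/xfs rather than doing a plain copy
as shown):

```python
import hashlib
import pathlib
import shutil
import tempfile

def store_by_checksum(src: pathlib.Path, pool: pathlib.Path) -> str:
    """Store src in the pool under its SHA-256, deduplicating identical files."""
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    dest = pool / digest
    if not dest.exists():
        shutil.copy2(src, dest)  # on btrfs/xfs this copy would be a reflink
    return digest

# Two snapshots referencing the same .deb end up sharing one pool entry.
with tempfile.TemporaryDirectory() as tmp:
    tmpdir = pathlib.Path(tmp)
    pool = tmpdir / "pool"
    pool.mkdir()
    deb = tmpdir / "hello_2.10-3_amd64.deb"
    deb.write_bytes(b"not a real deb")
    first = store_by_checksum(deb, pool)
    second = store_by_checksum(deb, pool)
    entries = len(list(pool.iterdir()))
print(first == second, entries)
```

The snapshot trees would then only need to record the hash per file,
which is exactly what the Packages and Sources indices already provide.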

I have not migrated all of the snapshots captured on btrfs, because it
is a bit tricky to sync the files efficiently: transferring them to
another machine or making backups is where it gets difficult, as rsync
can break the reflinked files if you are not careful. There are patches
for rsync to support reflinks efficiently:

  https://github.com/WayneD/rsync/issues/119
  https://github.com/WayneD/rsync/issues/153


live well,
  vagrant




Re: Verification Builds and Snapshots For Debian

2023-09-30 Thread Vagrant Cascadian
On 2023-09-20, Lucas Nussbaum wrote:
> On 19/09/23 at 13:52 -0700, Vagrant Cascadian wrote:
>> * Looking forward and backwards at snapshots
>> 
>> I do think that a more complete snapshot approach is probably better
>> than package-specific snapshots, and it might be worth doing
>> forward-looking snapshots of ftp.debian.org (and security.debian.org and
>> incoming.debian.org), in addition to trying to fill out all the missing
>> past snapshots to be able to attempt verification builds of older
>> packages, such as all of bookworm.
>> 
>> Snapshotting the archive(s) multiple times per day, today, tomorrow, and
>> going forward will at least enable doing verification rebuilds of
>> packages starting from this point, with less immediate overhead than
>> trying to replicate the entire functionality or more complete history of
>> snapshot.debian.org.

In the meantime, I worked on a naive implementation of this, using
debmirror and btrfs snapshots (zfs or xfs are other likely candidates
for filesystem-level snapshots). It is working better than I expected!

It currently has snapshots for Debian amd64 on bookworm,
bookworm-backports, bookworm-proposed-updates, trixie, sid and
experimental (or I guess, rc-buggy...), and debian-security for
bookworm-security. This might be a little redundant, but just in case it
also covers incoming.debian.org for most of the above codenames,
starting between September 20th and 22nd (with some gaps while I was
sorting out what was worth capturing; it currently does not include
debian-installer images, for example, and some generations missed
.udebs). Soon it will start capturing October, and beyond! The machine
it is running on happens to be very close to a Debian mirror, which is
helpful! It also seems to have caught some snapshot generations that
snapshot.debian.org missed!

I also tried to backfill some snapshots from snapshot.debian.org for
"debian" and "debian-security" for roughly the same codenames, with more
success than I expected, capturing all of September and edging into
August so far. I hope to get as far back as maybe June, so that anything
built since the bookworm release has relevant snapshots. It mostly
works, although once in a while I appear to trip some download limits
and it stalls out.

The whole thing currently weighs in at about 550GB, while each
individual snapshot of the archive for amd64+all+source is under 330GB
if I recall correctly... so that is over a month's worth of snapshots
for the cost of about two full snapshots. Obviously, adding more
architectures would dramatically increase the space used (I would
probably add arm64, armhf, i386, ppc64el and riscv64 if I were to do
this again).


I'm in the process of using this snapshot mirror, calling out to
grep-dctrl and dose-builddebcheck (look mom, no database!), to generate
apt sources.list entries pointing at the appropriate snapshots for each
.buildinfo from September, and eventually to perform verification builds
for each of them. I think it covers roughly 6000 .buildinfo files,
which is not nothing!
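The sources.list generation step might look roughly like this (the URL
layout, option flags and function name are illustrative assumptions for
this sketch, not the actual tooling):

```python
# Hypothetical sketch: emit apt sources.list lines pointing at one
# snapshot generation of the mirror described above.
def snapshot_sources(base_url: str, stamp: str, suite: str) -> list[str]:
    # check-valid-until=no is needed because the Valid-Until field of
    # old Release files will have expired by verification-build time.
    return [
        f"deb [check-valid-until=no] {base_url}/{stamp}/debian {suite} main",
        f"deb-src [check-valid-until=no] {base_url}/{stamp}/debian {suite} main",
    ]

lines = snapshot_sources(
    "http://snapshot.reproducible-builds.org/snapshot-experiment/archive",
    "20230920T083000Z", "sid")
for line in lines:
    print(line)
```

With one such pair of lines per .buildinfo, apt can then resolve the
exact Installed-Build-Depends versions against the frozen snapshot.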


>> I wonder if having multiple snapshot.debian.org implementations might
>> actually be a desireable thing, as it is so essential to the ability to
>> do long-term reproducible builds verification builds, and having
>> additional independent snapshots could provide redundancy and the
>> ability to repair breakages if one of the services fails in some way.
>
> What is the state of efforts regarding alternate snapshot.d.o
> implementations?

The main one I was aware of:

  https://github.com/fepitre/debian-snapshot

I believe snapshot.reproducible-builds.org, which used this, is
currently on hiatus, but I hope to see it picked up again in 2024,
possibly with a different implementation...


> Has someone explored an implementation backed by S3-compatible storage,
> which would easily allow hosting it in a cloud?

No idea, but multiple options would be good! We would probably want a
lot of redundancy (multiple S3 providers, multiple "local" mirrors,
etc.), just because this sort of thing is so difficult to fix
retroactively (if possible at all)...

How difficult is it to implement deduplication with S3 storage? I saw a
few hits with a quick search...
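I have not dug into any specific S3 API, but the usual shape of the
solution is content-addressed storage: index each snapshot's files by
content hash and upload each unique object only once, so daily snapshots
that share most of their .debs cost almost nothing extra. A toy sketch,
with dictionaries standing in for a real bucket:

```python
import hashlib

# Toy object store: content hash -> bytes. Real code would use an S3 bucket.
store = {}
index = {}  # (snapshot, logical path) -> content hash

def put(snapshot, path, data):
    """Store data once per unique content; snapshots share identical files."""
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:          # upload only if the content is new
        store[digest] = data
    index[(snapshot, path)] = digest

# The same .deb appearing in two daily snapshots is stored only once:
deb = b"fake .deb contents"
put("2023-09-15", "pool/main/h/hello/hello_2.10-3_amd64.deb", deb)
put("2023-09-16", "pool/main/h/hello/hello_2.10-3_amd64.deb", deb)
print(len(store), len(index))  # 1 unique object, 2 index entries
```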


live well,
  vagrant


signature.asc
Description: PGP signature
___
Reproducible-builds mailing list
Reproducible-builds@alioth-lists.debian.net
https://alioth-lists.debian.net/cgi-bin/mailman/listinfo/reproducible-builds


Bug#1052257: diffoscope crashes(?) comparing some i386 debs (and others)

2023-09-27 Thread Vagrant Cascadian
On 2023-09-27, Holger Levsen wrote:
> On Wed, Sep 20, 2023 at 12:25:36PM +, Holger Levsen wrote:
>> so I've powercycled the machine and also disabled the armhf workers now.
>> (under the (weak) assumption that this bug is mostly trigged when running
>> diffoscope on 32bit .debs...)

Most frustrating!


> so on Sep 23rd I made diffoscope run under ionice -n6, and so far the machine
> has not gone down on its knees yet. 
>
> and on Sep 25th I said this on #debian-reproducible:
>
> when i made diffoscope run under ionice i was wondering if there were changes 
> in the linux 
> scheduler causing the probs we saw... now not seeing the machine go down to 
> its knees 
> within 60h uptime i'm saying this to "provoke" the problem coming back
> however, looking at 
> jenkins.debian.net/munin/static/dynazoom.html?cgiurl_graph=/munin-cgi/munin-cgi-graph_name=debian.net/jenkins.debian.net/uptime_x=800_y=400_epoch=1661108749_epoch=1695668749
> I can see that the highest uptime was 15 days..
> (since upgrading to bookworm which is when the probs started)
> it could also be kernel related, with switching to bookworm
> we switched from 5.10.0 to 6.1.0...
> so my next stabs c/would be: 
> a.) increase swap 
> b.) upgrade to kernel 6.4.0 from bpo
> c.) something else

Maybe try using a bullseye 5.10.x kernel for a while? Obviously better
if the issue is fixed in a newer kernel version ... but if 5.10.x
consistently works with bookworm userspace that ever so slightly narrows
down the issue?
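Something along these lines might work for trying the older kernel on a
bookworm system (the sources entry is standard; the exact 5.10 kernel
package name available from bullseye would need checking first):

```
# /etc/apt/sources.list.d/bullseye.list  (hypothetical)
deb http://deb.debian.org/debian bullseye main

# Then install a specific versioned bullseye kernel alongside the
# bookworm one (ABI version is a placeholder, check what is current):
#   apt update && apt install linux-image-5.10.0-26-amd64
# and select it at the boot menu, leaving the 6.1 kernel as a fallback.
```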


live well,
  vagrant




Re: Verification Builds and Snapshots For Debian

2023-09-20 Thread Vagrant Cascadian
On 2023-09-19, Vagrant Cascadian wrote:
> * Some actual results!
>
> Testing only arch:all and arch:amd64 .buildinfos, I had decent luck with
> 2023/09/16:
>
> total buildinfos to check: 538
> attempted/building: 535
>
> unreproducible: 28  5 %
> reproducible:   461 85 %
> failed: 46  8 %
> unknown:3   0 %
...
> I also had similar results for 2023-09-15 and 2023-09-17, but ... this
> morning most of those results mysteriously disappeared!?! No idea what
> happened to them.

It was nagging at me, so I re-ran the builds for those days; it was
not too bad, and the results look similar to the initial ones ... from memory:

2023/09/15
total buildinfos to check: 151
attempted/building: 151
unreproducible: 34  22 %
reproducible:   97  64 %
failed: 20  13 %
unknown:0   0 %

2023/09/17
total buildinfos to check: 152
attempted/building: 149
unreproducible: 10  6 %
reproducible:   125 82 %
failed: 14  9 %
unknown:3   1 %

Or, for all three days 2023-09-15 to 2023-09-17 combined:

total buildinfos to check: 839
attempted/building: 835
unreproducible: 72  8 %
reproducible:   683 81 %
failed: 80  9 %
unknown:4   0 %

Still not bad for real-world testing.
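For reference, the percentage columns in these tables are just each count
as an integer-truncated share of the total .buildinfo count, e.g. for the
combined three-day numbers:

```python
# Recompute the combined 2023-09-15..17 table; percentages are
# integer-truncated shares of the total .buildinfo count.
counts = {"unreproducible": 72, "reproducible": 683, "failed": 80, "unknown": 4}
total = 839  # total buildinfos to check (835 of which were attempted)

for status, n in counts.items():
    print(f"{status}: {n} {n * 100 // total} %")
```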


Also, my test environments unintentionally introduced a few more
variations, for example:

+Build-Tainted-By:
+ merged-usr-via-aliased-dirs

The thorn in Debian's side strikes again. The tarballs I was using were
usrmerged, but the buildds are still not doing usrmerge. Setting up a
non-usrmerge base tarball is more fiddly than it used to be, but it is
doable, and at least the .buildinfo records this information.


Mysterious discrepancies in dependency differences:

  libfontconfig-dev (= 2.14.2-6),
  libfontconfig1 (= 2.14.2-6),
- libfontconfig1-dev (= 2.14.2-6),

Apparently libfontconfig-dev provides libfontconfig1-dev, and
libfontconfig1-dev is a transitional package, and sometimes
dpkg-genbuildinfo decides to include it explicitly and... sometimes not?
I do not think this particular case is likely to change the build
results, at least.


 Environment:
- DEB_BUILD_OPTIONS="parallel=6"
+ DEB_BUILD_OPTIONS="parallel=7"
+ LANG="C.UTF-8"
  LC_ALL="C.UTF-8"

I could have set up the builds to use the same level of parallelism
easily enough.

LANG was trickier. Some of the buildd .buildinfo files explicitly set
LANG="C.UTF-8" but some have it undefined. If I left it unset, it ended
up using LANG="en_US.UTF-8".  I chose to consistently use LANG="C.UTF-8"
in my testing.  Although I am not even entirely sure C.UTF-8 is a valid
value for LANG...
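One quick way to check, at least with Python's locale module (glibc has
shipped a built-in C.UTF-8 for some time, but it is not guaranteed to
exist everywhere):

```python
import locale

# Check whether this system accepts C.UTF-8 as a locale value.
try:
    locale.setlocale(locale.LC_ALL, "C.UTF-8")
    print("C.UTF-8 accepted")
except locale.Error:
    print("C.UTF-8 not available")
```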


live well,
  vagrant




Verification Builds and Snapshots For Debian

2023-09-19 Thread Vagrant Cascadian
I experimented with verification builds, building packages that were
recently built by the Debian buildd infrastructure... relatively soon
after the .buildinfo files are made available, without relying on
snapshot.debian.org... with the goal of getting bit-for-bit identical
verification of newly added packages in the Debian archive.

Overall, I think the results are promising and we should actually try
something kind of like this in a more systematic way!


Fair warning, this has turned into quite a long email...


* Background

For the most part in Debian, we have been doing CI builds, where a
package is built twice and the results compared, but that does not verify
the packages in the official Debian archive. It is useful, especially for
catching regressions in toolchains and such, but verifying the packages
people actually use is obviously desirable.

In order to actually perform a verification build, you need the exact
same packages installed in a build environment...

There was a beta project performing verification builds that appears to
have stalled sometime in 2022:

  https://beta.tests.reproducible-builds.org/

From what I recall, one of the main challenges was the reliability of
the snapshot.debian.org service, which led to the development of an
alternative snapshotting service, although that is currently not yet
completed...

At some point, debsnapshot was used to perform some limited testing, but
this was also dependent on a reliable snapshot.debian.org.

There have been several other attempts at rebuilders for Debian, but
the main challenge usually seems to come down to having a working
snapshot service in order to be able to sufficiently reproduce the build
environment a package was originally built in...


* Summary of approach for this experiment

Copy a .buildinfo file from either
coccia.debian.org:/srv/ftp-master.debian.org/buildinfo/2023/09/16 or
https://buildinfos.debian.net/ftp-master.debian.org/buildinfo/2023/09/16
or other dates, but something fairly recent for best results...

Create a package-specific snapshot of all the exact versions of packages
in the .buildinfo file (Installed-Build-Depends).

Build a package with the exact versions from the .buildinfo file added
as build-dependencies, with the package-specific snapshot added to
available repositories (as well as a bunch of others), leveraging "sbuild
--build-dep-resolver=aptitude" to resolve the potentially complicated
build dependencies.
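Roughly, each build then boils down to an sbuild invocation along these
lines (a sketch only: the snapshot path and package name are hypothetical
placeholders, not output of the actual scripts):

```shell
# Assemble (but do not run here) the per-.buildinfo sbuild invocation;
# the per-package snapshot is served as a local trusted apt repository.
SNAPSHOT_REPO="deb [trusted=yes] file:///srv/pkg-snapshots/hello_2.10-3 ./"
CMD="sbuild --dist=unstable --build-dep-resolver=aptitude \
  --extra-repository=\"$SNAPSHOT_REPO\" hello_2.10-3"
echo "$CMD"
```

The aptitude resolver is what lets apt chase down the exact pinned
versions even when that requires upgrades or downgrades.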

This supports sid and experimental reasonably well, including binNMUs.
It also supports the few bookworm-proposed-updates and
bookworm-backports .buildinfo files to some degree. I am not sure where to
get .buildinfo files for debian-security, but I would love to test those
as well! In theory it also supports trixie, but nearly all packages
for trixie currently get built in sid/unstable rather than directly in
trixie.

I found that building sid and experimental worked best starting with a
slightly out-of-date trixie tarball, as it was almost always easier to
upgrade packages than to downgrade. Currently bookworm-proposed-updates
and bookworm-backports are fairly stable, although possibly the same
issue might apply.


* Package specific snapshots vs. complete snapshots

I have mixed feelings on the package-specific snapshots. It solves the
problem of getting old versions of packages to verify the build (or at
least could, with a bit more work), but with some drawbacks (custom apt
keyring, redundant information in *many* little snapshots, kind of
complicated).

Having explored package-specific snapshots, I think a better approach
might be to make forward-looking snapshots of ftp.debian.org,
incoming.debian.org and ideally security.debian.org (in addition to
snapshot.debian.org or a replacement)...

With locally available complete snapshots, each .buildinfo can be
processed as soon as possible to find the list of snapshots that would
satisfy the dependencies (to reduce the likelihood of having to rummage
through older snapshots to find dependencies)... and make an addendum to
the .buildinfo file that includes enough information to fully resolve
all the build dependencies... allowing the build to be performed at some
other time. This addendum might also need to recommend a snapshot for
the build chroot or base tarball, though that might be a bit trickier.

This could avoid having to leverage something like metasnap.debian.net,
which can process a .buildinfo and spit out the relevant snapshots.


* The Code

My proof-of-concept collection of scripts, configuration, and total
lack of documentation:

  
https://salsa.debian.org/reproducible-builds/debian-verification-build-experiment

In retrospect, I should clearly have started by poking more at
debrebuild and other prior art... oops!

This also did not handle the syncing of the .buildinfo files at all,
which I did manually for this experiment, but that is a fairly
straightforward problem, and buildinfos.debian.net does this already.


* Some actual results!


Bug#1042918: reprotest: FTBFS with tox 4

2023-08-02 Thread Vagrant Cascadian
Control: tags 1042918 pending

On 2023-08-02, Stefano Rivera wrote:
> I thought we'd managed to avoid this, in #1035645, but we just did the
> transition, and I see reprotest is FTBFS:
...
> py311: commands[0] .pybuild/cpython3_3.11_reprotest/build>
> .tox/py311/bin/python -m coverage run --omit '.tox/*' --parallel -m py.test tests/
> __path__ attribute not found on 'py' while trying to find 'py.test'
> py311: exit 1 (0.09 seconds) /<>> .tox/py311/bin/python -m
> coverage run --omit '.tox/*' --parallel -m py.test tests/ pid=7370
>   py311: FAIL code 1 (2.31=setup[2.22]+cmd[0.09] seconds)
>   evaluation failed :( (2.36 seconds)
> E: pybuild pybuild:388: test: plugin distutils failed with: exit code=1: cd
> /<>/.pybuild/cpython3_3.11_reprotest/build; tox -c /<>/tox.ini
> --sitepackages --installpkg
> /<>/.pybuild/cpython3_3.11_reprotest/reprotest-0.7.25-py3-none-any.whl
> -e py311
> dh_auto_test: error: pybuild --test --test-tox -i python{version} -p 3.11
> --test-tox returned exit code 13
>
>
> I'm guessing you want to replace py.test there with pytest.

Great suggestion! Worked for me locally and in salsa-ci.

Pushed to git:

  
https://salsa.debian.org/reproducible-builds/reprotest/-/commit/3d01047ae75ffc3fc30ad11aeec1a0dd9994d849

live well,
  vagrant




Re: FTBFS on all recent uploads

2023-07-27 Thread Vagrant Cascadian
On 2023-07-27, Peter Blackman wrote:
> I can see qt6ct has been rescheduled. Thanks.
>
> Looks OK on my packages overview, and the tracker,
> but is actually unreproducible!

This is because those pages only track whether trixie/testing is
reproducible, while unstable and experimental test more variations.


> FWIW, the note looks wrong.
> https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/qt6ct.html
>
>
> The note says
>      timestamps_in_source_generated_by_rcc

Yeah, I suspect quite a few of that particular issue are misfiled,
because many of them consistently build fine in trixie, just not
unstable...

Compare the unstable results to trixie:

  
https://tests.reproducible-builds.org/debian/issues/unstable/timestamps_in_source_generated_by_rcc_issue.html

  5 FTBFS, 66 unreproducible

  
https://tests.reproducible-builds.org/debian/issues/trixie/timestamps_in_source_generated_by_rcc_issue.html

  49 reproducible, 2 FTBFS, 18 unreproducible

So my hunch is only ~18 of those are correctly marked, as timestamp
variations are more-or-less always tested on all suites.


> but the logs show the problem to be with NT_GNU_BUILD_ID.
> Usually an rpath issue.
> https://reproducible-builds.org/docs/deterministic-build-systems/#cmake-notes

Yeah, probably something triggered by build paths, which is not varied
in trixie/testing. The typical cmake rpath issue is fairly easy to
verify, by passing flags to configure:

  
https://tests.reproducible-builds.org/debian/issues/unstable/cmake_rpath_contains_build_path_issue.html

Although sometimes there may be other causes.
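Concretely, the usual quick check looks something like this (the cmake
flag is the documented workaround for rpaths that embed the build
directory; the binary name is a placeholder):

```shell
# Reconfigure so the rpath uses $ORIGIN instead of the absolute build
# path, rebuild, and see whether the NT_GNU_BUILD_ID difference goes away:
cmake -DCMAKE_BUILD_RPATH_USE_ORIGIN=ON ..
make
# Inspect what ended up in the binary (name is hypothetical):
readelf -d ./some-binary | grep -iE 'rpath|runpath'
```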

Thanks for looking into it, hope that gives some good leads!

live well,
  vagrant




Re: FTBFS on all recent uploads

2023-07-25 Thread Vagrant Cascadian
On 2023-07-25, Peter B wrote:
> "We had a few glitches after the bookworm release, but I *think* they
> should be sorted out by now... FTBFS do get
> rescheduled fairly often, and if you really need, they can be manually
> rescheduled (if you are a debian developer, you
> should be able to click on the triangle of arrows to reschedule)."
>
> Vagrant Cascadian vagr...@reproducible-builds.org
> Thu Jun 22 18:25:48 BST 2023
>   (I was not subscribed at the time)
...
> I have several packages FTBFS from over a month ago. No sign of any reruns.
> (One got fixed when I uploaded a new version.)

according to:

  https://tests.reproducible-builds.org/debian/index_performance.html

Looks like the whole archive is rebuilt roughly every 2 to 2.5 months.

As you noticed, new versions get priority, and FTBFS *should* get
scheduled more often, as well... but who knows, sometimes things fall
into cracks...


> Wondering if there is any chance for maintainers without an SSO sign-on
> to be able to schedule reruns for their own packages?

At the moment, I do not believe so. I think sso.debian.org will
eventually be retired and so another option will need to be explored at
some point...

You can also try asking for a few specific ones to be rescheduled by
dropping by irc.oftc.net #debian-reproducible, or ask for specific
packages to be rescheduled via this mailing list.


If your packages are hosted on salsa.debian.org, you can also use the CI
pipelines, which includes reprotest testing:

  https://salsa.debian.org/salsa-ci-team/pipeline/-/blob/master/README.md

It is not the same as our tests.reproducible-builds.org infrastructure,
but should help identify most issues before uploading to Debian!


live well,
  vagrant




Re: FTBFS on all recent uploads

2023-06-22 Thread Vagrant Cascadian
On 2023-06-22, Boyuan Yang wrote:
> On Tue, 2023-06-20 at 06:31 +0200, Joachim Zobel wrote:
>> I have a FTBFS on all uploads after the end of the freeze. Example:
>> https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/gap-aclib.html
>> 
>> All these packages had been changed to compat 13. Unfortunately I have
>> no idea why I get these. I can see two logs of successful builds and a
>> diff for them. At the end of each log I find reproducible ➤ FTBFS, so
>> this probably is because my build fails to be reproducible. However the
>> only diff I see is the diff between the build logs. 
>> 
>> Any hints on what to do about these would be helpful.
>
> In the link above, when clicking "build2" under amd64 1.3.2-3--build2, the log
> shows that the build machine raised an error before your source code
> actually got built. It might be a machine configuration error. Forwarding
> your email to Debian reproducible-builds mailing list since they may know
> more about it.

We had a few glitches after the bookworm release, but I *think* they
should be sorted out by now...

FTBFS do get rescheduled fairly often, and if you really need, they can
be manually rescheduled (if you are a debian developer, you should be
able to click on the triangle of arrows to reschedule).

live well,
  vagrant




Re: usrmerge testing in our CI

2023-06-19 Thread Vagrant Cascadian
On 2023-06-17, Holger Levsen wrote:
> On Sat, Jun 17, 2023 at 02:08:20PM -0700, Vagrant Cascadian wrote:
>> From what I recall looking at the log posted in irc it might be
>> sufficient to "apt autoremove usrmerge" after the fact, as usrmerge is
>> not really reversible... e.g. the /bin -> /usr/bin and other symlinks
>> should continue to persist.
>
> yes, we have done something like this in the past. for now, I just
> wanted to get the builds back first.
>
> and this seems to have worked!
>
> quoting https://tests.reproducible-builds.org/debian/index_performance.html
>
> packages tested on 2023-06-16 2   0   1   0
> packages tested in the last 24h   3665604323221652

After another fix, yes, it is working! :)


>> > as said originally: we are the reproducible builds project, not the 
>> > Debian QA project trying to find as many issues as we can.
>> It is a judgement call just how much of the variations are "just" QA and
>> how much are important reproducible builds issues...
>> 
>> I mean, sure, the argument can be made that a usrmerge and non-usrmerge
>> environment are not the same build environment, even if they otherwise
>> have the same packages installed!
>
> well, for sure that's a different environment. hence some want to test,
> whether builds in those still result in the same result. :)

Exactly...


>> I see the purpose of testing exceptional things as actual reproducible
>> builds issues, and even if unlikely to occur in Debian, they might help
>> out some other project that makes different choices than Debian.
>
> and I'd like to focus on setting up rebuilders, finally and properly.
>  
> or, in other words: I want to stop doing^wfocussing on QA and improve
> the real world.
>
> and I'm also sure we can do both. but until we *have* rebuilders and
> a working snapshot mirror, I'd personally like to spent time on CI.
> And don't get me wrong: I want to keep maintaining the CI, but with
> less priority and with less priority on introducing or maintaining
> variations which have little real world impact.
>
> If you want to rebuild^wreproduce a build, you'll install whatever
> packages where installed in the 1st build. voila.

Sure, we have agreement on that, at least on the surface! :)


>> > I'd rather have us focus on real reproducibility issues, than issues
>> > with a variation we, Debian, don't care about.
>> I would also like that we focus on real reproducibility issues, with
>> variations that we, Reproducible Builds and Debian, care about. :P
>
> I honestly don't care as much about our CI build results as what
> Debian publishes on ftp.d.o - CI builds are just a means to an end.

Well, I think the most important feature of the CI builds is catching
reproducibility regressions! Which happen far more often than I
would like...


>> >> > I'm less clear about whether we should cease testing bookworm. It
>> >> > doesn't seem right for the CI to claim that various [bookworm]
>> >> > packages are "reproducible in bookworm", when the presence of usrmerge
>> >> > (or lack thereof) in bookworm means that they can still vary.
>> > bookworm wasn't build with usrmerge variation, but rather usrmerge
>> > was explicitly disabled on the builds.
>> For clarity, you mean on buildd.debian.org?
>
> no. I mean in our CI, for the 2nd build.
>
> the code I disabled yesterday was:
>
>case "${SUITE}" in
>buster|bullseye);;
>*)  pbuilder_options+=(--extrapackages usrmerge)
>;;
>esac
>
> (see 675175178cc)

That looks to me like bookworm was at least intended to be built with
usrmerge enabled, as it was in the "*" rather than "buster|bullseye"
match.


>> > so one can also say that it doesnt seem right to introduce a variation
>> > which never occured.
>> I am near-absolutely-positive that it did occur on many developer
>> machines, many of whom hopefully signed and uploaded their .buildinfo
>> files...
>
> those builds didnt make it to testing, because the release team requires
> that packages are build on buildds.

But the developer .buildinfo files are available. That is what I am
talking about.

Longer-term, I would love to see the workflow for uploading to Debian
be something more like:

  Developer builds and pushes source + .buildinfo to debian infrastructure,
  buildd builds the .buildinfo on two different buildd machines,
  if all three match, upload to the actual archive...

Obviously, we are quite a long ways from there...

The ship has basically sailed for bookworm, 

Re: usrmerge testing in our CI

2023-06-17 Thread Vagrant Cascadian
On 2023-06-17, Holger Levsen wrote:
> first: i've temporarily disabled testing usrmerge variation last
> night as this broke our builds, because the .buildinfo files vary
> (usrmerge installed or not), causing basically all builds to fail.

Ah well!

From what I recall looking at the log posted in irc it might be
sufficient to "apt autoremove usrmerge" after the fact, as usrmerge is
not really reversible... e.g. the /bin -> /usr/bin and other symlinks
should continue to persist.
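For reference, whether a given environment ended up usrmerged can be
checked by looking at the top-level symlinks, something like:

```shell
# On a merged-usr system, /bin, /sbin and /lib are symlinks into /usr;
# on a non-merged system they are real directories.
for d in bin sbin lib; do
  if [ -L "/$d" ]; then
    echo "/$d -> $(readlink "/$d")"
  else
    echo "/$d is a real directory (not merged)"
  fi
done
```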


> second: this has been happening since a few weeks, I still don't
> get why this suddenly stopped working as we have varied usrmerge
> since 2020.

I think at several times during the bookworm cycle the techniques had to
be adjusted in order to actually test usrmerge variations... and this
just seems like a new surprise in a long list of surprises...


> third: the code which was in use (since then) was varying usrmerge
> everywhere except buster & bullseye!

I am confident that during the bullseye release cycle it was actually
enabled, and then once it was released it was changed or just not tested
for some reason.


> On Thu, Jun 15, 2023 at 09:55:38AM -0700, Vagrant Cascadian wrote:
>> On 2023-06-15, Chris Lamb wrote:
>> Off the top of my head, I do not know how many, but definitely some of
>> the usrmerge related bugs I have found will successfully build in a
>> usrmerge environment, but in a way that quietly breaks the package
>> (e.g. junk in manpages or other documentation, entirely missing
>> documentation, embedding an entirely nonsensical path, etc.). It would be
>> hard to systematically find these bugs without testing builds in
>> usrmerge and non-usrmerge environments. And I am not sure any other
>> project makes any more sense to do that sort of testing.
>
> as said originally: we are the reproducible builds project, not the 
> Debian QA project trying to find as many issues as we can.

It is a judgement call just how much of the variations are "just" QA and
how much are important reproducible builds issues...

I mean, sure, the argument can be made that a usrmerge and non-usrmerge
environment are not the same build environment, even if they otherwise
have the same packages installed!

How many times have we had people ask us to "just ignore timestamps"? Of
course, that is impractical to actually ignore, due to checksum
mismatches and the need to maintain a consistent system clock state.

How about /bin/sh -> bash vs. /bin/sh -> dash? The transition is
arguably complete.

Build paths? These in theory are easy to reproduce and build in the same
path, but in practice there are a lot of real-world build path
variations on Debian infrastructure...


I see the purpose of testing exceptional things as actual reproducible
builds issues, and even if unlikely to occur in Debian, they might help
out some other project that makes different choices than Debian.


> I'd rather have us focus on real reproducibility issues, than issues
> with a variation we, Debian, don't care about.

I would also like that we focus on real reproducibility issues, with
variations that we, Reproducible Builds and Debian, care about. :P


>> > I'm less clear about whether we should cease testing bookworm. It
>> > doesn't seem right for the CI to claim that various [bookworm]
>> > packages are "reproducible in bookworm", when the presence of usrmerge
>> > (or lack thereof) in bookworm means that they can still vary.
>
> bookworm wasn't build with usrmerge variation, but rather usrmerge
> was explicitly disabled on the builds.

For clarity, you mean on buildd.debian.org?


> so one can also say that it doesnt seem right to introduce a variation
> which never occured.

I am near-absolutely-positive that it did occur on many developer
machines, many of whom hopefully signed and uploaded their .buildinfo
files...

Even some builds on buildd.debian.org were done with usrmerge enabled,
as debootstrap at various points defaulted to creating a usrmerge
chroot. And then did not. And then did. And I am not even sure what
the default is anymore. So even if people were using sbuild, pbuilder,
etc. for a clean build environment, they very likely uploaded packages
and corresponding .buildinfo files throughout the bookworm release cycle
that were built in both usrmerge and non-usrmerge environments...

Yes most of those packages did not actually make it into bookworm,
thanks to an awesome release team policy, but I would not be surprised
if some that used build-depends on some of these maintainer-built
packages ended up in bookworm... and of course a few builds that
happened on buildds with usrmerge enabled might have also slipped
through?

So, it is significantly more than never, at the very least. Never is a
pretty low bar to disprove. :)


>> I find it a

Re: usrmerge testing in our CI

2023-06-15 Thread Vagrant Cascadian
On 2023-06-15, Chris Lamb wrote:
> Holger Levsen wrote:
>
>>  as soon as buildds are merged, varying trixie no longer makes 
>>  sense to me in either case
> […]
>> so should we stop testing usrmerge variations at all now?
>
> Thanks for taking this to the list. For 100% clarity, I understand
> Holger's "at all" suffix to mean, "shall we disable the usrmerge
> variation for bookworm as well as trixie?"
>
> Assuming that's the correct interpretation, from the perspective of
> someone patching packages to make them reproducible, it no longer
> makes sense to prepare patches that address any usrmerge issues. At
> the very least, they will not be prioritised by package maintainers.

I still see usrmerge as a real-world (even if technically unsupported in
Debian) variation that detects real bugs... though I think even for the
bookworm release cycle, maintainers were hesitant to apply usrmerge
related patches.

Most of the usrmerge bugs I encounter often feel a bit like busywork,
but they are also usually fairly trivial to fix.

Off the top of my head, I do not know how many, but definitely some of
the usrmerge related bugs I have found will successfully build in a
usrmerge environment, but in a way that quietly breaks the package
(e.g. junk in manpages or other documentation, entirely missing
documentation, embedding an entirely nonsensical path, etc.). It would be
hard to systematically find these bugs without testing builds in
usrmerge and non-usrmerge environments. And I am not sure it would make
any more sense for any other project to do that sort of testing.


> I'm less clear about whether we should cease testing bookworm. It
> doesn't seem right for the CI to claim that various [bookworm]
> packages are "reproducible in bookworm", when the presence of usrmerge
> (or lack thereof) in bookworm means that they can still vary.

I find it a bit weird to change the variations after the release, as
then the stats will gradually change as packages get retested, without
any actual work done to change them... feels a bit too much like moving
the goalposts retroactively.


live well,
  vagrant




Bug#1035645: reprotest: FTBFS with tox 4

2023-05-09 Thread Vagrant Cascadian
On 2023-05-09, stefa...@debian.org wrote:
> Hi Vagrant (2023.05.07_00:01:34_+)
>> This patch is trivial and fixes building with tox 4, but unfortunately
>> breaks building with tox 3:
>
> How about this, works on both, by using the multiline syntax:
>
> diff --git a/tox.ini b/tox.ini
> index 1c3488d..579e4bd 100644
> --- a/tox.ini
> +++ b/tox.ini
> @@ -10,7 +10,10 @@ commands =
>{envpython} -m coverage html
>  
>  [testenv]
> -passenv = REPROTEST_TEST_* VIRTUALENV_DOWNLOAD *_proxy TERM
> +passenv =
> +  REPROTEST_TEST_*
> +  VIRTUALENV_DOWNLOAD
> +  *_proxy,TERM
>  # usedevelop = True
>  deps =
>coverage

Thanks, tested and pushed a fix to git based on that idea!

live well,
  vagrant




Bug#1035645: reprotest: FTBFS with tox 4

2023-05-06 Thread Vagrant Cascadian
Control: tags 1035645 patch

On 2023-05-06, Stefano Rivera wrote:
> Log snippet:
> dh_auto_test -- --test-tox
> I: pybuild base:240: cd
> /<>/.pybuild/cpython3_3.11_reprotest/build; tox -c
> /<>/tox.ini --sitepackages -e py311
> py311: failed with pass_env values cannot contain whitespace, use comma to
> have multiple values in a single line, invalid values found 'REPROTEST_TEST_*
> VIRTUALENV_DOWNLOAD *_proxy TERM'
>   py311: FAIL code 1 (0.00 seconds)
>   evaluation failed :( (0.06 seconds)
> E: pybuild pybuild:388: test: plugin distutils failed with: exit code=1: cd
> /<>/.pybuild/cpython3_3.11_reprotest/build; tox -c /<>/tox.ini
> --sitepackages -e py311
> dh_auto_test: error: pybuild --test --test-tox -i python{version} -p 3.11
> --test-tox returned exit code 13

This patch is trivial and fixes building with tox 4, but unfortunately
breaks building with tox 3:

From 8ee881b36a3578e652cc1693fd047692b1fa3fa9 Mon Sep 17 00:00:00 2001
From: Vagrant Cascadian 
Date: Sat, 6 May 2023 16:48:39 -0700
Subject: [PATCH] tox.ini: Fix build with tox 4 by using comma-separated
 values. (Closes: #1035645)

---
 tox.ini | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tox.ini b/tox.ini
index 1c3488d..0e76251 100644
--- a/tox.ini
+++ b/tox.ini
@@ -10,7 +10,7 @@ commands =
   {envpython} -m coverage html
 
 [testenv]
-passenv = REPROTEST_TEST_* VIRTUALENV_DOWNLOAD *_proxy TERM
+passenv = REPROTEST_TEST_*,VIRTUALENV_DOWNLOAD,*_proxy,TERM
 # usedevelop = True
 deps =
   coverage
-- 
2.39.2


I am not terribly familiar with tox; is it plausible to create a patch
that works with both of them? Maybe a conditional in tox.ini?


live well,
  vagrant




Re: Bug#1030785: -ffile-prefix-map option and reproducibility

2023-02-08 Thread Vagrant Cascadian
On 2023-02-08, Stéphane Glondu wrote:
> Thank you all for your answers!
>
> Using:
>
>DEB_BUILD_MAINT_OPTIONS=reproducible=-fixfilepath,-fixdebugpath
>
> makes the package unreproducible in another way that seems difficult to fix.

Most likely reintroducing the things that the -ffile-prefix-map and
-fdebug-prefix-map were effectively removing...


We track these kinds of issues with the "records build flags" issue,
which has a description of the problem and links to more information:

  
https://tests.reproducible-builds.org/debian/issues/unstable/records_build_flags_issue.html

There are some potential fixes to the issue more fundamentally, but they
are currently stalled out... one of which I should probably spend some
time on after bookworm release...
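As a quick illustration of what -ffile-prefix-map does (a sketch;
assumes gcc 8 or later, and the temporary directory stands in for the
package build directory):

```shell
# Compile a file via its absolute path, mapping the build directory
# prefix away so the path embedded through __FILE__ (and recorded in
# debug info) no longer reveals where the build happened.
d=$(mktemp -d)
printf '#include <stdio.h>\nint main(void){puts(__FILE__);return 0;}\n' > "$d/hello.c"
gcc -ffile-prefix-map="$d"=. -o "$d/hello" "$d/hello.c"
"$d/hello"   # prints ./hello.c instead of the absolute build path
```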


You had earlier asked when this was enabled; this can mostly be found in
the dpkg changelog:

fixfilepath (a.k.a. -ffile-prefix-map) Enabled by default:

dpkg (1.20.6) unstable; urgency=medium
...

  * dpkg-buildflags: Enable reproducible=fixfilepath by default.  Thanks
to Vagrant Cascadian .  See
https://lists.debian.org/debian-devel/2020/10/msg00222.html.
Closes: #974087
...
 -- Guillem Jover   Fri, 08 Jan 2021 04:39:40 +0100


fixfilepath (a.k.a. -ffile-prefix-map) feature added, and enabled in
reproducible builds infrastructure soon after:

dpkg (1.19.1) unstable; urgency=medium
...
- Dpkg::Vendor::Debian: Add fixfilepath support to reproducible feature.
...
 -- Guillem Jover   Wed, 26 Sep 2018 15:13:22 +0200


fixdebugpath (a.k.a. -fdebug-prefix-map) enabled by default:

dpkg (1.18.10) unstable; urgency=medium
...
- Enable fixdebugpath build flag feature by default.
  Thanks to Mattia Rizzolo . Closes: #832179
...
 -- Guillem Jover   Sun, 31 Jul 2016 12:57:02 +0200


fixdebugpath (a.k.a. -fdebug-prefix-map) feature added, and presumably
enabled in reproducible builds infrastructure soon after:

dpkg (1.18.5) unstable; urgency=medium
...
- Add fixdebugpath to reproducible feature in Dpkg::Vendor::Debian.
  Thanks to Daniel Kahn Gillmor . Closes:
  #819194
...
 -- Guillem Jover   Mon, 02 May 2016 04:14:57 +0200


Of course, this is only for packages respecting dpkg-buildflags.


> Le 07/02/2023 à 19:12, Mattia Rizzolo a écrit :
>> I actually propose to you to filter out the whole option from being
>> saved. [...]
>
> I've gone this way, and managed to make the package reproducible, at 
> least with the build path variation.

Glad that works!


> I will upload the fixed ocaml package when the current batch of related 
> packages waiting in unstable migrates to testing, hopefully in 4 days.

Thanks!


live well,
  vagrant




Bug#1012035: reprotest: Salsa CI randomly fails because dpkg-source can't change timestamp

2023-01-27 Thread Vagrant Cascadian
On 2022-06-01, Holger Levsen wrote:
> On Sun, May 29, 2022 at 08:26:18PM +0200, Samuel Thibault wrote:
>> Zhang Boyang, le dim. 29 mai 2022 14:10:35 +0800, a ecrit:
>> > I found Salsa CI reprotest on my repo fails when "FAKETIME variation:
>> > faketime = [balabala]" is decided. The relevant output is:
>> > 
>> > dpkg-source: error: cannot change timestamp for
>> > build-experiment-1/.pc/applied-patches: Invalid argument
>> I have seen reprotest randomly fail in the salsa CI for various packages
>> indeed.
>
> indeed, merging with "#961064: reprotest: should not default to vary time and 
> date"
> and raising the severity and tagging help because reprotest is (almost) 
> unmaintained
> upstream, and tagging newcomer as the fix should be really easy, so if you are
> looking for ways to start contributing to reproducible builds, please do! ;p 
> :)
>
> I'll certainly be happy to review, merge and upload.

Given that the overwhelming majority of reproducible builds issues are
time related, and regressions happen all the time, disabling this would
fail to catch many issues, for what I believe to be a small minority of
affected packages.  I have personally run thousands of builds using
reprotest, and only rarely have come across this issue.

It typically causes issues with packages that patch files at build time,
such as applying debian/patches/ ... but most definitely not all
patches.

The salsa-ci pipelines document how to add custom arguments to reprotest:

  
https://salsa.debian.org/salsa-ci-team/pipeline#adding-extra-arguments-to-reprotest

So adding to your configuration:

  variables:
    SALSA_CI_REPROTEST_ARGS: --vary=-time

It might be good to update their documentation to use the --vary
arguments and include -time,-build_path ... I can maybe propose a patch
to the salsa-ci team for that.
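Combining both exclusions, a sketch of the corresponding configuration
(variable name as in the salsa-ci documentation linked above):

```yaml
# .gitlab-ci.yml fragment: skip the time and build path variations in
# the salsa-ci reprotest job.
variables:
  SALSA_CI_REPROTEST_ARGS: --vary=-time,-build_path
```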


live well,
  vagrant




Bug#1029066: diffoscope: FTBFS if no internet is available (using internet connection during build)

2023-01-17 Thread Vagrant Cascadian
Control: found 1029066 230

On 2023-01-17, Gianfranco Costamagna wrote:
> Hello, your package FTBFS when internet is not available during control file 
> regeneration phase
...
> debian/tests/control.sh
> Generating the debian/tests/control file...
...
> ERROR: Could not find a version that satisfies the requirement setuptools 
> (from versions: none)
> ERROR: No matching distribution found for setuptools
> Traceback (most recent call last):
>File "", line 1, in 
>File "/usr/lib/python3/dist-packages/pep517/meta.py", line 72, in load
>  path = Path(build_as_zip(builder))
>  ^
>File "/usr/lib/python3/dist-packages/pep517/meta.py", line 59, in 
> build_as_zip
>  builder(dest=out_dir)
>File "/usr/lib/python3/dist-packages/pep517/meta.py", line 53, in build
>  env.pip_install(system['requires'])
>File "/usr/lib/python3/dist-packages/pep517/envbuild.py", line 103, in 
> pip_install
>  check_call(
>File "/usr/lib/python3.11/subprocess.py", line 413, in check_call
>  raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command '['/usr/bin/python3', '-m', 'pip', 
> 'install', '--ignore-installed', '--prefix', 
> '/tmp/pep517-build-env-i1vz1wye', 'setuptools', 'wheel']' returned non-zero 
> exit status 1.
> Files debian/tests/control and debian/tests/control.tmp differ
>
> The generated control file differs from the actual one.
> A sourceful upload of this package is needed.
>
> Differences:
> --- debian/tests/control  2023-01-13 07:05:01.0 +
> +++ debian/tests/control.tmp  2023-01-17 02:06:59.340564039 +
> @@ -7,7 +7,7 @@
>   #   $ mv debian/tests/control.tmp debian/tests/control
>   
>   Tests: pytest-with-recommends
> -Depends: python3-all, diffoscope, black, python3-pytest, python3-h5py, file, 
> linux-image-amd64 [amd64] | linux-image-generic [amd64], abootimg, acl, 
> apksigcopier, apksigner, apktool [!ppc64el !s390x], binutils-multiarch, 
> bzip2, caca-utils, colord, coreboot-utils, db-util, default-jdk-headless | 
> default-jdk | java-sdk, device-tree-compiler, docx2txt, e2fsprogs, enjarify, 
> ffmpeg, fontforge-extras, fonttools, fp-utils [!ppc64el !s390x], genisoimage, 
> gettext, ghc, ghostscript, giflib-tools, gnumeric, gnupg, gnupg-utils, 
> hdf5-tools, html2text, imagemagick, jsbeautifier, libarchive-tools, 
> libxmlb-dev, llvm, lz4 | liblz4-tool, lzip, mono-utils, ocaml-nox, odt2txt, 
> oggvideotools [!s390x], openssh-client, openssl, pgpdump, poppler-utils, 
> procyon-decompiler, python3-pdfminer, r-base-core, rpm2cpio, sng, sqlite3, 
> squashfs-tools, tcpdump, u-boot-tools, unzip, wabt, xmlbeans, xxd, xz-utils, 
> zip, zstd, androguard, python3-argcomplete, python3-binwalk, 
> python3-defusedxml, python3-distro, python3-guestfs, python3-jsondiff, 
> python3-progressbar, python3-pypdf, python3-debian, python3-pyxattr, 
> python3-rpm, python3-tlsh
> +Depends: python3-all, diffoscope, black, python3-pytest, python3-h5py, file, 
> linux-image-amd64 [amd64] | linux-image-generic [amd64], abootimg, acl, 
> apksigcopier, apksigner, apktool [!ppc64el !s390x], binutils-multiarch, 
> bzip2, caca-utils, colord, coreboot-utils, db-util, default-jdk-headless | 
> default-jdk | java-sdk, device-tree-compiler, docx2txt, e2fsprogs, enjarify, 
> ffmpeg, fontforge-extras, fonttools, fp-utils [!ppc64el !s390x], genisoimage, 
> gettext, ghc, ghostscript, giflib-tools, gnumeric, gnupg, gnupg-utils, 
> hdf5-tools, html2text, imagemagick, jsbeautifier, libarchive-tools, 
> libxmlb-dev, llvm, lz4 | liblz4-tool, lzip, mono-utils, ocaml-nox, odt2txt, 
> oggvideotools [!s390x], openssh-client, openssl, pgpdump, poppler-utils, 
> procyon-decompiler, python3-pdfminer, r-base-core, rpm2cpio, sng, sqlite3, 
> squashfs-tools, tcpdump, u-boot-tools, unzip, wabt, xmlbeans, xxd, xz-utils, 
> zip, zstd,

Confirmed that it also affects the version in bookworm:

  
https://tests.reproducible-builds.org/debian/rb-pkg/bookworm/amd64/diffoscope.html

And according to test history, may go back to much earlier versions
(222+)... although there were some other possible FTBFS issues that may
have affected older versions, and I don't immediately see a way to look
at the logs from older versions.

live well,
  vagrant




Re: Bug#828683: mc: please make the build reproducible

2023-01-10 Thread Vagrant Cascadian
On 2022-12-22, Vagrant Cascadian wrote:
> On 2022-10-06, Vagrant Cascadian wrote:
>> The attached alternate implements this for mc by touching the potential
>> files before running configure with a consistent timestamp.
>>
>> According to my local tests, applying this patch should make mc build
>> reproducibly once it lands in testing/bookworm! There are outstanding
>> build path issues tested in unstable and experimental.
> ...
>> From 4e69587954d29ec6bfc7d85b4b618724b16b840e Mon Sep 17 00:00:00 2001
>> From: Vagrant Cascadian 
>> Date: Thu, 6 Oct 2022 19:25:11 +
>> Subject: [PATCH 5/5] debian/rules: Ensure consistent timestamp on manpages.
>>
>> The upstream build system uses the file modification time of the .in
>> file for the manpage to embed into the generatated manpage, but if
>> debian/patches modify the .in files, the timestamp is updated,
>> resulting in builds performed in different months or years embedding a
>> different date.
>> ---
>>  debian/rules | 2 ++
>>  1 file changed, 2 insertions(+)
>>
>> diff --git a/debian/rules b/debian/rules
>> index c633a73..bf8eb5e 100755
>> --- a/debian/rules
>> +++ b/debian/rules
>> @@ -25,6 +25,8 @@ override_dh_autoreconf:
>>  # AWK="awk" is inheritance of 4.7.* series, see 
>> http://bugs.debian.org/499723
>>  # might be still necessary for extfs scripts
>>  override_dh_auto_configure:
>> +# Ensure reproducible timestamp on manpages
>> +touch -d@$(SOURCE_DATE_EPOCH) doc/man/*.1.in doc/man/*/*.1.in
>>  dh_auto_configure -- AWK="awk" X11_WWW="x-www-browser" \
>>  --libexecdir='/usr/lib' \
>>  --with-x \
>> -- 
>> 2.37.2
>
> I intend to NMU with this patch on the 29th (with a 10-day delay) unless
> I hear otherwise.

A little behind schedule, and a little more conservative patch (only
touching the currently patched mcedit manpage)...

Uploaded NMU to DELAYED/10:

diff -Nru mc-4.8.28/debian/changelog mc-4.8.28/debian/changelog
--- mc-4.8.28/debian/changelog  2022-04-02 15:10:15.0 -0700
+++ mc-4.8.28/debian/changelog  2023-01-10 09:50:36.0 -0800
@@ -1,3 +1,11 @@
+mc (3:4.8.28-1.1) unstable; urgency=medium
+
+  * Non-maintainer upload.
+  * debian/rules: Ensure consistent timestamp on mcedit manpage.
+(Closes: #828683)
+
+ -- Vagrant Cascadian   Tue, 10 Jan 2023 
09:50:36 -0800
+
 mc (3:4.8.28-1) unstable; urgency=medium
 
   * New upstream release.
diff -Nru mc-4.8.28/debian/rules mc-4.8.28/debian/rules
--- mc-4.8.28/debian/rules  2018-06-05 21:11:33.0 -0700
+++ mc-4.8.28/debian/rules  2023-01-10 09:49:58.0 -0800
@@ -25,6 +25,9 @@
 # AWK="awk" is inheritance of 4.7.* series, see http://bugs.debian.org/499723
 # might be still necessary for extfs scripts
 override_dh_auto_configure:
+   # Ensure reproducible timestamp on mcedit manpage, which is
+   # patched from debian/patches
+   touch -d@$(SOURCE_DATE_EPOCH) doc/man/mcedit.1.in
dh_auto_configure -- AWK="awk" X11_WWW="x-www-browser" \
--libexecdir='/usr/lib' \
--with-x \




Debian NMU Sprint Tuesday, January 10th, 16:00 UTC!

2023-01-04 Thread Vagrant Cascadian
First Debian NMU Sprint of 2023... this coming Tuesday, January 10th,
16:00 UTC!

Some past sprints:

  
https://lists.reproducible-builds.org/pipermail/rb-general/2022-November/002756.html

IRC:

  irc.oftc.net #debian-reproducible

Unapplied patches:

  
https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=1

Documentation about performing NMUs:

  https://www.debian.org/doc/manuals/developers-reference/pkgs.html#nmu

If you are impatient, try fixing QA packages, as you can upload fixes
without delays:

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_maint_debian-qa.html


live well,
  vagrant




Re: Salsa reprotest CI not picky enough? (Was: Bug#1026877: opari2: please make the build reproducible)

2022-12-23 Thread Vagrant Cascadian
On 2022-12-24, Samuel Thibault wrote:
> Chris Lamb, le ven. 23 déc. 2022 05:32:47 +, a ecrit:
>> Whilst working on the Reproducible Builds effort [0] we noticed that
>> opari2 could not be built reproducibly.
>> 
>> Patch attached that exports CFLAGS from dpkg-buildflags(1), ensuring
>> that -fdebug-prefix-map (and similar) to the underlying build system.
>
> I'm wondering how it is that the salsa reprotest CI didn't catch it:
>
> https://salsa.debian.org/debian/opari2/-/jobs/3675809
>
> Is there some salsa-ci option that I should enable to make it more
> picky?

  https://salsa.debian.org/debian/opari2/-/jobs/3696823#L1036

  INFO:reprotest:build "experiment-1": FIX environment, FIX build_path, ...

Something is telling reprotest to not vary build paths. I also see weird
things suggesting the standard build path reprotest uses is duplicated:

  https://salsa.debian.org/debian/opari2/-/jobs/3696823#L473

  make[1]: Entering directory 
'/tmp/reprotest.57Bwiu/const_build_path/const_build_path'

Normally, there would just be a single "const_build_path" subdir. I
noticed this issue on one of the packages I maintain, but haven't poked
at it yet.

This suggests to me the salsa-ci reprotest jobs are somehow manually
adjusting the build paths in some way above-and-beyond how reprotest
normally varies the build path? We probably should follow-up with the
salsa-ci folks...


The salsa-ci example under "Adding extra arguments to reprotest"
suggests that build path variations are enabled by default:

  
https://salsa.debian.org/salsa-ci-team/pipeline/-/tree/master#adding-extra-arguments-to-reprotest

  variables:
    SALSA_CI_REPROTEST_ARGS: --variations=-build_path

You could try inverting it, and passing --variations=+build_path


live well,
  vagrant




Last Debian NMU Sprint of the year, December, 29th Thursday 17:00 UTC!

2022-12-22 Thread Vagrant Cascadian
On 2022-11-20, Vagrant Cascadian wrote:
> Since the previous sprints were fun and productive, I am planning on
> doing NMU sprints every Thursday in December (1st, 8th, 15th, 22nd,
> 29th). We are planning on meeting on irc.oftc.net in the
> #debian-reproducible channel at 17:00UTC and going for an hour or two or
> three. Feel free to start early or stay late, or even fix things on some
> other day!

Last call! December 29th, 17:00 UTC...

> We will have Debian Developers available to sponsor uploads, so even if
> you can't upload yourself but you know how to build a debian package,
> please join us!
>
> Unapplied patches:
>
>   
> https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=1

We have finally closed more patches than we opened! When we started
doing these NMU sprints back in September, it was around ~250 unapplied
patches... and it is down to 249! Great marketing, I know.

I think most remaining patches are *only* two or three years stale...

We might continue these in 2023 and switch up the days and times, and of
course, feel free to go solo and upload an NMU outside of these
sprints!


live well,
  vagrant




Re: Bug#828683: mc: please make the build reproducible

2022-12-22 Thread Vagrant Cascadian
On 2022-10-06, Vagrant Cascadian wrote:
> The attached alternate implements this for mc by touching the potential
> files before running configure with a consistent timestamp.
>
> According to my local tests, applying this patch should make mc build
> reproducibly once it lands in testing/bookworm! There are outstanding
> build path issues tested in unstable and experimental.
...
> From 4e69587954d29ec6bfc7d85b4b618724b16b840e Mon Sep 17 00:00:00 2001
> From: Vagrant Cascadian 
> Date: Thu, 6 Oct 2022 19:25:11 +
> Subject: [PATCH 5/5] debian/rules: Ensure consistent timestamp on manpages.
>
> The upstream build system uses the file modification time of the .in
> file for the manpage to embed into the generatated manpage, but if
> debian/patches modify the .in files, the timestamp is updated,
> resulting in builds performed in different months or years embedding a
> different date.
> ---
>  debian/rules | 2 ++
>  1 file changed, 2 insertions(+)
>
> diff --git a/debian/rules b/debian/rules
> index c633a73..bf8eb5e 100755
> --- a/debian/rules
> +++ b/debian/rules
> @@ -25,6 +25,8 @@ override_dh_autoreconf:
>  # AWK="awk" is inheritance of 4.7.* series, see http://bugs.debian.org/499723
>  # might be still necessary for extfs scripts
>  override_dh_auto_configure:
> + # Ensure reproducible timestamp on manpages
> + touch -d@$(SOURCE_DATE_EPOCH) doc/man/*.1.in doc/man/*/*.1.in
>   dh_auto_configure -- AWK="awk" X11_WWW="x-www-browser" \
>   --libexecdir='/usr/lib' \
>   --with-x \
> -- 
> 2.37.2

I intend to NMU with this patch on the 29th (with a 10-day delay) unless
I hear otherwise.


live well,
  vagrant




Debian NMU Sprints in December, Thursdays 17:00 UTC!

2022-11-20 Thread Vagrant Cascadian
Since the previous sprints were fun and productive, I am planning on
doing NMU sprints every Thursday in December (1st, 8th, 15th, 22nd,
29th). We are planning on meeting on irc.oftc.net in the
#debian-reproducible channel at 17:00UTC and going for an hour or two or
three. Feel free to start early or stay late, or even fix things on some
other day!

We will have Debian Developers available to sponsor uploads, so even if
you can't upload yourself but you know how to build a debian package,
please join us!

Unapplied patches:

  
https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=1

This list is sorted by the oldest bugs with patches not marked pending,
so we can target bugs that have just stalled out for whatever reason,
but feel free to pick bugs that scratch your particular itch.

We will want to make sure the patch still applies and/or refresh the
patches, make sure it still solves the issue, and update the bug report
where appropriate.

Documentation about performing NMUs:

  https://www.debian.org/doc/manuals/developers-reference/pkgs.html#nmu

We will be uploading to the DELAYED queue (presumably with between 10
and 15 days of delay).

If the package has been orphaned we can generally upload without delay
(check the https://tracker.debian.org/PACKAGE page which usually lists
this) and mark it as maintained by "Debian QA Group
" if needed.

If you are impatient, try fixing QA packages, as you can upload fixes
without delays:

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_maint_debian-qa.html


Let's fix some bugs!


live well,
  vagrant




Debian NMU Sprint Thursday November 17th 17:00 UTC!

2022-11-11 Thread Vagrant Cascadian
On 2022-11-11, Chris Lamb wrote:
> Can you clarify whether you meant *Wednesday* November 16th or
> Thursday November *17th*? :)

Oops! Thursday November 17th!

live well,
  vagrant






Debian NMU Sprint Thursday November 16th 17:00 UTC!

2022-11-10 Thread Vagrant Cascadian
We were productive and had some fun with the previous NMU sprints:

  
https://lists.reproducible-builds.org/pipermail/rb-general/2022-September/002689.html

So we are planning on meeting on irc.oftc.net in the
#debian-reproducible channel at 17:00UTC and going for an hour or two or
three.

We will have Debian Developers available to sponsor uploads, so even if
you can't upload yourself but you know how to build a debian package,
please join us!

Unapplied patches (currently 287):

  
https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=1

This list is sorted by the oldest bugs with patches not marked pending,
so we can target bugs that have just stalled out for whatever reason,
but feel free to pick bugs that scratch your particular itch.

We will want to make sure the patch still applies and/or refresh the
patches, make sure it still solves the issue, and update the bug report
where appropriate.

Documentation about performing NMUs:

  https://www.debian.org/doc/manuals/developers-reference/pkgs.html#nmu

We will be uploading to the DELAYED queue (presumably with between 10
and 15 days of delay).

If the package has been orphaned we can generally upload without delay
(check the https://tracker.debian.org/PACKAGE page which usually lists
this) and mark it as maintained by "Debian QA Group
" if needed.


Let's fix some bugs!


live well,
  vagrant




Re: Debian NMU Sprint Thursday 16:00 UTC!

2022-11-08 Thread Vagrant Cascadian
On 2022-11-08, Chris Lamb wrote:
>> > We are planning on meeting on irc.oftc.net in the #debian-reproducible
>> > channel at 16:00UTC and going for an hour or two or three.
>>
>> It was fun, so we hope to do this roughly every two weeks!
>> Next one is thus planned for Thursday, October 6th, 16:00 UTC!
>
> I enjoyed the sprint on October 6th and found it both fun and
> productive; can we schedule another one...?

Basically, I'm usually up for it any Thursday at that time slot. Two
days might be a bit too soon to schedule (though I would do it if someone
else wanted to!)...

  how about November 17th and December 1st? 16:00 UTC or 17:00 UTC?

I did a solo one on October 20th and still got a couple NMUs in:

  msp430mcu_20120406-2.3_source.ftp-master.upload
  checkpw_1.02-1.2_source.ftp-master.upload

And did the legwork that led to a QA upload the following day:

  madlib_1.3.0-3_source.ftp-master.upload


live well,
  vagrant




Re: Bug#828683: mc: please make the build reproducible

2022-10-06 Thread Vagrant Cascadian
On 2016-06-27, Yury V. Zaytsev wrote:
> On Mon, 27 Jun 2016, Reiner Herrmann wrote:

Going through some old outstanding reproducible builds related bugs...

>> You are right, the mtime of the file is used for the manpage timestamp. 
>> But there is still a patch modifying the manpage: 
>> mcedit_full_path.patch. When this patch is upstreamed, the 
>> SOURCE_DATE_EPOCH patch is not needed, though it would still help when 
>> the manpages are patched.
>
> Thank you for the confirmation that my guesses are correct!
>
> However, I'm hesitant to apply SOURCE_DATE_EPOCH to anything that looks 
> like a nail. I believe that patching of upstream sources by Debian is such 
> an oft occurring situation, that I would rather look for at solution at 
> the packaging level. After all, builds from *upstream* sources are 
> *already* reproducible, in this case, it's the Debian build procedure that 
> is injecting randomness here.

I am biased, but I do think using SOURCE_DATE_EPOCH could still be a
reasonable approach upstream. There are some examples of how to use this
in various languages:

  https://reproducible-builds.org/docs/source-date-epoch/

I did notice some things in the upstream code
doc/man/date-of-man-include.am:

MAN_DATE_CMD = \
	LC_ALL=$(DATE_LANG) @PERL@ -CS -MPOSIX -e '\
		@fi=lstat("'$${MAN_FILE}'"); \
		print POSIX::strftime("$(DATE_FORMAT)", localtime($$fi[9]));' 2>/dev/null

This uses localtime, which could be affected by timezone differences. To
my surprise, it doesn't appear to affect the results, as only the
patched manpage is affected:

  
https://tests.reproducible-builds.org/debian/rb-pkg/bookworm/amd64/diffoscope-results/mc.html

Setting LC_ALL explicitly avoids variations due to the build
environment's locale (yay!), although the specified locales are not
guaranteed to be available (adding locales-all to build-depends could
fix that).

Again, surprisingly, this has not appeared to affect the results for the
locales tested on tests.reproducible-builds.org, though that may mean
the default locale is used even for the translated pages...
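The localtime concern can be seen with a two-line experiment (GNU date;
the epoch value is arbitrary):

```shell
# The same mtime renders differently depending on TZ, so a manpage date
# generated via localtime() would vary with the builder's timezone.
TZ=UTC date -d @1000000000 '+%Y-%m-%d %H:%M'          # 2001-09-09 01:46
TZ=Asia/Tokyo date -d @1000000000 '+%Y-%m-%d %H:%M'   # 2001-09-09 10:46
```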


> For instance, it seems to me that it is only logical to set the mtime of 
> patched files to the mtime of the last patch that touched them, and this 
> will make the source mtime dependent builds fully reproducible.

The attached alternate implements this for mc by touching the potential
files before running configure with a consistent timestamp.

According to my local tests, applying this patch should make mc build
reproducibly once it lands in testing/bookworm! There are outstanding
build path issues tested in unstable and experimental.

It doesn't address the larger issue of files modified by debian/patches
having the current timestamp, which is a much more complicated
intersection of issues.
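The core of the touch-before-configure approach is a one-liner (a
sketch; the path is illustrative, the mkdir is only there to make the
snippet self-contained, and SOURCE_DATE_EPOCH is exported by
dpkg-buildpackage during real builds):

```shell
# Stamp the files whose mtimes the upstream build system reads, using
# the deterministic SOURCE_DATE_EPOCH rather than the time the patches
# happened to be applied.
SOURCE_DATE_EPOCH="${SOURCE_DATE_EPOCH:-$(date +%s)}"
mkdir -p doc/man
touch -d "@${SOURCE_DATE_EPOCH}" doc/man/mcedit.1.in
```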


live well,
  vagrant
From 4e69587954d29ec6bfc7d85b4b618724b16b840e Mon Sep 17 00:00:00 2001
From: Vagrant Cascadian 
Date: Thu, 6 Oct 2022 19:25:11 +
Subject: [PATCH 5/5] debian/rules: Ensure consistent timestamp on manpages.

The upstream build system uses the file modification time of the .in
file for the manpage to embed into the generated manpage, but if
debian/patches modify the .in files, the timestamp is updated,
resulting in builds performed in different months or years embedding a
different date.
---
 debian/rules | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/debian/rules b/debian/rules
index c633a73..bf8eb5e 100755
--- a/debian/rules
+++ b/debian/rules
@@ -25,6 +25,8 @@ override_dh_autoreconf:
 # AWK="awk" is inheritance of 4.7.* series, see http://bugs.debian.org/499723
 # might be still necessary for extfs scripts
 override_dh_auto_configure:
+	# Ensure reproducible timestamp on manpages
+	touch -d@$(SOURCE_DATE_EPOCH) doc/man/*.1.in doc/man/*/*.1.in
 	dh_auto_configure -- AWK="awk" X11_WWW="x-www-browser" \
 		--libexecdir='/usr/lib' \
 		--with-x \
-- 
2.37.2





Re: Debian NMU Sprint Thursday 16:00 UTC!

2022-09-22 Thread Vagrant Cascadian
On 2022-09-21, Vagrant Cascadian wrote:
> We are planning on meeting on irc.oftc.net in the #debian-reproducible
> channel at 16:00UTC and going for an hour or two or three.

It was fun, so we hope to do this roughly every two weeks!
Next one is thus planned for Thursday, October 6th, 16:00 UTC!

> We will have at least two Debian Developers available to sponsor
> uploads, so even if you can't upload yourself but you know how to build
> a debian package, please join us!
>
>
> Unapplied patches (currently ~250):
>
>   
> https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=7=7=1=last_modified=asc=html#results

bugs closed: 1
bugs marked pending: 7
bugs followup: 2

Excluding bugs marked pending, I think we're down to 232 patches to
upload, ideally before the freeze in January.

Seemed like Holger focused on bugs with a high popularity contest score,
and I focused on bugs that haven't seen activity since 2015!


h01ger:
- mailed #1010957 asking for an update and whether to remove the patch tag 
for now
- uploaded src:gmp to DELAYED/15 fixing #1009931
- mailed  #1017372 and asked for maintainer opinion on the patch
- uploaded src:time to DELAYED/15 fixing #983202

vagrant:
- verify and update patch for mylvmbackup https://bugs.debian.org/782318
- uploaded mylvmbackup to DELAYED/10
- verify/update patches for libranlip
  https://bugs.debian.org/788000
  https://bugs.debian.org/846975
  https://bugs.debian.org/1007137
- uploaded libranlip to DELAYED/10
- verified patch for cclive https://bugs.debian.org/824501
- uploaded cclive to DELAYED/10
- was unable to reproduce the issue with two patches:
  #791423 linuxtv-dvb-apps: please make the build reproducible
Marked as done
  #794398 clhep: please make the build reproducible
Uncertain of status


live well,
  vagrant




Debian NMU Sprint Thursday 16:00 UTC!

2022-09-21 Thread Vagrant Cascadian
Holger and I were chatting about doing more Debian NMUs
(Non-Maintainer-Uploads) to clear the huge backlog of reproducible
builds patches submitted... and we may as well get started this
Thursday!

We are planning on meeting on irc.oftc.net in the #debian-reproducible
channel at 16:00UTC and going for an hour or two or three.

We will have at least two Debian Developers available to sponsor
uploads, so even if you can't upload yourself but you know how to build
a debian package, please join us!


Unapplied patches (currently ~250):

  
https://udd.debian.org/bugs/?release=sid=only=ign=ign=ign=7=7=1=last_modified=asc=html#results

The list is sorted by activity, so we can target bugs that have just
stalled out for whatever reason, but feel free to pick bugs that scratch
your particular itch.

We will want to make sure the patch still applies and/or refresh the
patches, make sure it still solves the issue, and update the bug report
where appropriate.


Documentation about performing NMUs:

  https://www.debian.org/doc/manuals/developers-reference/pkgs.html#nmu

We will be uploading to the DELAYED queue (presumably between 10 and
15 days).
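For anyone who has not done one before, an NMU to the DELAYED queue typically looks roughly like this (a sketch only: the package name "foo", its version, the URL, and the changelog text are all placeholders, and the developers-reference link above is the authoritative procedure):

```shell
# Hypothetical NMU of package "foo" to the 10-day delayed queue.
dget https://deb.debian.org/debian/pool/main/f/foo/foo_1.0-1.dsc
cd foo-1.0
quilt push -a        # confirm the patch stack still applies cleanly
                     # (a patch from the BTS goes into debian/patches first)
dch --nmu "Apply reproducibility patch from the BTS."
dpkg-buildpackage    # test build (or use sbuild/pbuilder)
debsign ../foo_1.0-1.1_amd64.changes
dput --delayed=10 ftp-master ../foo_1.0-1.1_amd64.changes
nmudiff              # mail the debdiff to the bug report
```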


If this is fun and productive, we might keep doing this approximately
once or twice a month!


live well,
  vagrant




Bug#1019742: reprotest: add a variation that sets DEB_BUILD_OPTIONS=nocheck

2022-09-14 Thread Vagrant Cascadian
On 2022-09-14, Philip Hands wrote:
> Vagrant Cascadian  writes:
>
>>> but also
>>> (given that the tests will have passed during the normal build) the tests
>>> failing during the varied build seems unlikely to be identifying faults
>>> that are worth fixing, and so is just a waste of cycles.
>>
>> How do you know whether the bugs it is identifying are worth fixing? It
>> could also identify non-deterministic failures, or failures triggered by
>> specific build environment configurations...
>
> The point is that if the package is reproducible, then the fact that its
> tests fail when run in a weird environment (that may never be found in
> the wild) seems rather likely to be finding errors in the tests rather
> than errors in the program that gets shipped.

Fair point! 

> Of course, if the package is not reproducible, the tests may well fail
> because the package ends up containing new bugs that are only present in
> the variant-built package, but then its also going to show up as
> non-reproducible, so does that really make a difference?

True, though it may make it harder to verify reproducibility in
practice, especially if it is a fairly "normal" variation that triggers
the issue...


It is a balancing act...

I guess I'd be fine with the defaults to go either way, but it would be
important to be able to enable or disable however this gets implemented.


live well,
  vagrant




Bug#1019742: reprotest: add a variation that sets DEB_BUILD_OPTIONS=nocheck

2022-09-14 Thread Vagrant Cascadian
On 2022-09-14, Philip Hands wrote:
> I suggest adding a 'nocheck' variation, that sets DEB_BUILD_OPTIONS=nocheck
> during the build,

Sounds reasonable!

> and enabling it by default.

Less sure...

> The reason for doing so is that one could imagine that a package produces
> differing results depending upon whether the tests were run or not

Indeed!

> but also
> (given that the tests will have passed during the normal build) the tests
> failing during the varied build seems unlikely to be identifying faults
> that are worth fixing, and so is just a waste of cycles.

How do you know whether the bugs it is identifying are worth fixing? It
could also identify non-deterministic failures, or failures triggered by
specific build environment configurations...

> This idea is prompted by `busybox` where the tests fail in the varied 
> scenario,
> despite the fact that the package is reproducible.

> Here they are failing:
>
>   https://salsa.debian.org/installer-team/busybox/-/jobs/3227197
>
>   (among other things, du produces weird results when the `fileordering`
>   variation is active, claiming the 1MB directory is 2MB so the tests fail, so
>   the varied package is not produced, so we don't get to see that it was
>   reproducible:
>
> https://salsa.debian.org/installer-team/busybox/-/blob/master/testsuite/du/du-m-works
>   )

Consistently? Then, maybe the test needs to be improved?
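(For context, the general class of bug the `fileordering` variation exists to surface can be shown in a few lines of shell. This is only an illustration of the principle, not busybox's actual du test:)

```shell
# reprotest's "fileordering" variation (via disorderfs) perturbs the order
# in which the filesystem returns directory entries. A tool that emits
# entries in readdir order is only deterministic if it sorts them.
dir=$(mktemp -d)
touch "$dir/b" "$dir/a" "$dir/c"

# find prints entries in readdir order, which is filesystem-dependent:
unsorted=$(find "$dir" -mindepth 1 -printf '%f\n')

# sorting restores determinism regardless of the underlying order:
listing=$(find "$dir" -mindepth 1 -printf '%f\n' | sort)
echo "$listing"    # always: a b c (one per line)
```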


> I found a couple of ways of making the issue go away:
>
>   1) disabling the 'fileordering' variation, thus:
>
> 
> https://salsa.debian.org/installer-team/busybox/-/commit/17387890c73388e1f56a6ae9fbc79783095b4e86
>
> https://salsa.debian.org/installer-team/busybox/-/jobs/3233259
>
>   2) telling the package to skip the tests when doing the variations:
>
> 
> https://salsa.debian.org/installer-team/busybox/-/commit/5260442e8ceea220fa36bdda169978d15108f781
>
> which is setting this in the salsa-ci.yml:
>   SALSA_CI_REPROTEST_ARGS: --variations=environment.variables+=DEB_BUILD_OPTIONS=nocheck
>
> https://salsa.debian.org/installer-team/busybox/-/jobs/3235476
>
>
> Option 2) is what I'm suggesting making into a default variation.
>
> If nothing else it will speed up testing of packages with extensive test
> suites.

I think it's a valuable feature, but I'm not entirely sure whether it
should be default or not...


live well,
  vagrant




Bug#1012318: diffoscope 214 produced no output and was killed after running into timeout after 150m

2022-06-07 Thread Vagrant Cascadian
On 2022-06-06, Chris Lamb wrote:
> Vagrant Cascadian wrote:
> It's not strictly a *bug* that diffoscope takes a long time, but it is
> curious that the best-effort "--timeout" is not kicking in early enough
> and ensuring that the harsher timeout does not kill diffoscope outright.
>
> Indeed, the --timeout option was implemented precisely to avoid having
> no output whatsoever on tests.r-b.org.

Nevertheless, it could get stuck on some arbitrary file, no? Or does it
actually try to abort processing a file if it is taking too long?


> Vagrant, do you have --profile output, out of interest? It would be
> interesting, for instance, if the HTML generation was taking so long
> that even if diffoscope stopped recursing into subarchives, it hit
> Jenkins' hard timeout. This may suggest that there is a bug in the
> HTML generation, and/or that the --timeout value on tests.r-b.org has
> not been set low enough relative to the hard timeout.

Yeah, I think maybe we should add more time to the hard jenkins timeout
to make sure it really gets *some* output.

The original was just the defaults that reprotest uses (perhaps we
should add --profile?).

I re-ran diffoscope with profile output using the same artifacts

# Profiling output for: /usr/bin/diffoscope --progress --profile --text=halide.diffoscope control/ experiment-1

## close_archive (total time: 0.001s)
   0.000s         24 calls    diffoscope.comparators.xz.XzContainer
   0.000s         22 calls    diffoscope.comparators.tar.TarContainer
   0.000s          6 calls    diffoscope.comparators.deb.DebContainer
   0.000s         10 calls    diffoscope.comparators.deb.DebTarContainer

## command (total time: 1816.184s)
 805.925s       8949 calls    readelf
 770.865s       2011 calls    diff
 232.245s       1188 calls    objdump
   5.085s         36 calls    xz
   2.002s        198 calls    strings
   0.026s          2 calls    cmp
   0.026s          2 calls    cmp (external)
   0.004s         54 calls    cmp (internal)
   0.003s         66 calls    stat
   0.002s         18 calls    objcopy

## compare_files (cumulative) (total time: 3904.923s)
 922.891s          6 calls    abc.DebFile
 922.341s         12 calls    abc.XzFile
 916.800s          6 calls    abc.DebDataTarFile
 873.202s         18 calls    abc.ElfFile
 183.319s         68 calls    diffoscope.comparators.elf.ElfCodeSection
  46.212s        338 calls    diffoscope.comparators.elf.ElfSection
  29.528s       1047 calls    abc.TextFile
  10.056s         34 calls    diffoscope.comparators.elf.ElfStringSection
   0.446s          6 calls    abc.TarFile
   0.126s          6 calls    abc.Md5sumsFile
   0.000s          1 call     diffoscope.comparators.directory.FilesystemDirectory

## container_extract (total time: 6.318s)
   5.089s         36 calls    diffoscope.comparators.xz.XzContainer
   1.109s       6388 calls    diffoscope.comparators.deb.DebTarContainer
   0.098s         48 calls    diffoscope.comparators.deb.DebContainer
   0.022s         40 calls    diffoscope.comparators.tar.TarContainer

## has_same_content_as (total time: 0.109s)
   0.071s       1057 calls    abc.TextFile
   0.016s          9 calls    abc.DebFile
   0.013s          6 calls    abc.DebDataTarFile
   0.003s        338 calls    diffoscope.comparators.elf.ElfSection
   0.002s         18 calls    abc.ElfFile
   0.001s         12 calls    abc.Md5sumsFile
   0.001s         12 calls    abc.XzFile
   0.001s          6 calls    abc.TarFile
   0.001s         68 calls    diffoscope.comparators.elf.ElfCodeSection
   0.000s          3 calls    diffoscope.comparators.utils.libarchive.LibarchiveSymlink
   0.000s         34 calls    diffoscope.comparators.elf.ElfStringSection
   0.000s          1 call     diffoscope.comparators.directory.FilesystemDirectory

## main (total time: 935.422s)
 935.417s          2 calls    outputs
   0.005s          1 call     cleanup

## open_archive (total time: 0.001s)
   0.000s         36 calls    diffoscope.comparators.xz.XzContainer
   0.000s         24 calls    diffoscope.comparators.tar.TarContainer
   0.000s         12 calls    diffoscope.comparators.deb.DebTarContainer
   0.000s         12 calls    diffoscope.comparators.deb.DebContainer

## output (total time: 12.131s)
  12.131s          1 call     text

## recognizes (total time: 8.809s)
   7.074s      51970 calls    diffoscope.comparators.utils.libarchive.LibarchiveMember
   1.111s       7436 calls    diffoscope.comparators.elf.ElfSection
   0.240s       1496 calls    diffoscope.comparators.elf.ElfCodeSection
   0.221s       1488 calls    diffoscope.comparators.utils.archive.ArchiveMember
   0.119s        748 calls    diffoscope.comparators.elf.ElfStringSection
   0.043s        468 calls    diffoscope.comparators.binary.FilesystemFile
   0.001s         96 calls    abc.Md5sumsFile
   0.000s         12 calls    diffoscope.comparators.utils.libarchive.LibarchiveSymlink

## specialize (total time:

Bug#1012318: diffoscope 214 produced no output and was killed after running into timeout after 150m

2022-06-06 Thread Vagrant Cascadian
On 2022-06-03, Vagrant Cascadian wrote:
> On 2022-06-04, Roman Lebedev wrote:
>> it would appear, diffoscope is failing when running against the halide package:
>>
>> https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/diffoscope-results/halide.html
>>
>> Unfortunately, I do not know what the more specific problem is.
...
> I'll see if I can find you some diffoscope output for halide by running
> it on a fairly powerful machine I have access to...

https://people.debian.org/~vagrant/reproducible/halide.diffoscope.out.zst

It's under a 200MB download, and expands to ~2GB uncompressed...


live well,
  vagrant




Bug#1012318: diffoscope 214 produced no output and was killed after running into timeout after 150m

2022-06-03 Thread Vagrant Cascadian
On 2022-06-04, Roman Lebedev wrote:
> it would appear, diffoscope is failing when running against the halide package:
>
> https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/diffoscope-results/halide.html
>
> Unfortunately, I do not know what the more specific problem is.

Some things just take a long time to run through diffoscope, and
tests.reproducible-builds.org only lets diffoscope run for 150m before
killing it, to keep very large unreproducible packages from eating all
the resources.

One option would be to run diffoscope on two builds of halide locally.
The reprotest tool does this, and runs diffoscope by default if it is
installed.
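Such a local run can be sketched as follows (assuming reprotest and a working sbuild schroot named "unstable-amd64-sbuild"; the .dsc filename is a placeholder):

```shell
# Build the package twice under varied environments; if diffoscope is
# installed, reprotest runs it on the two results automatically.
reprotest halide_*.dsc -- schroot unstable-amd64-sbuild
```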

There is also beta.tests.reproducible-builds.org which may not have a
diffoscope timeout, so you might be able to find diffoscope output for
halide there.

There may be improvements that could make diffoscope run faster and
avoid the timeout, though I'm not sure this is a fundamental problem of
diffoscope.

I'll see if I can find you some diffoscope output for halide by running
it on a fairly powerful machine I have access to...


live well,
  vagrant




Re: Including build metadata in packages

2022-05-07 Thread Vagrant Cascadian
On 2022-02-16, Simon McVittie wrote:
> On Sun, 13 Feb 2022 at 14:13:10 -0800, Vagrant Cascadian wrote:
>> Obviously, this would interfere with any meaningful reproducible builds
>> testing for any package that did something like this. Ideally metadata
>> like this about a build should *not* be included in the .deb files
>> themselves.
>
> Relatedly, I would like to be able to capture some information about
> builds even if (perhaps especially if) the build fails. It might make
> sense to combine that with what you're looking at. It doesn't seem
> ideal that for a successful build, the maintainer can recover detailed
> results via a .deb (at a significant reproducibility cost), but for a
> failing build - perhaps one that fails as a result of test regressions
> - they get no information other than what's in the log. If anything,
> these artifacts seem *more* important for failing builds.

Yes, thanks for bringing up the potential for getting better information
out of failed builds; that is a valuable angle to consider!


>> * Split build metadata into a separate file or archive
>> 
>> Some of the debian-installer packages generate tarballs that are not
>> .deb files and are included in the .changes files when uploading to the
>> archive; making a similar generalized option for other packages to put
>> build metadata into a separate artifact might be a workable approach,
>> although this would presumably require toolchain changes in dpkg and dak
>> at the very least, and might take a couple release cycles, which
>> is... well, debian.
>> 
>> The possibility of bundling up .buildinfo files into this metadata too,
>> while taking some changes in relevant dpkg, dak, etc. tooling, might in
>> the long term be worth exploring.
>> 
>> There was a relevant bug report in launchpad:
>> 
>>   https://bugs.launchpad.net/launchpad/+bug/1845159
>> 
>> This seems like the best long-term approach, but pretty much *only* a
>> long-term approach...
>
> I think even if we do one of the other approaches as a stopgap, we'll
> want this in the long term.
>
> There are two approaches that could be taken to this. One is to use
> BYHAND, as Paul Wise already discussed. This would require action from the
> ftp team and dak (I think), but nothing special in sbuild or the buildd
> infrastructure.
>
> However, I'd prefer it if this was output from the build alongside the log,
> instead of being exported via the .changes file, so that failing builds
> can also produce artifacts, to help the maintainer and/or porters to
> figure out why the build failed. This would require action in sbuild and
> the buildd infrastructure, but not in dak, because handling build logs is
> not dak's job (and I don't think handling things like the binutils test
> results should be dak's job either).
>
> Here's a straw-man spec, which I have already prototyped in
> <https://salsa.debian.org/debian/sbuild/-/merge_requests/14>:
>
> Each whitespace-separated token in the Build-Artifacts field represents
> a filename pattern in the same simplified shell glob syntax used in
> "Machine-readable debian/copyright file", version 1.0.
>
> If the pattern matches a directory, its contents are included in
> the artifacts, recursively. If a pattern matches another file type,
> it is included in the artifacts as-is. If a pattern does not match
> anything, nothing is included in the artifacts: this may be diagnosed
> with a warning, but is not an error.
>
> If a pattern matches files outside the build directory, is an absolute
> path, or contains ".." segments, build tools may exclude those files
> from the artifacts.
>
> Build tools should collect the artifacts that match the specified
> patterns, for example in a compressed tar archive, and make them
> available alongside the build log for inspection. The artifacts should
> usually be collected regardless of whether the build succeeds or fails.
>
> The Build-Artifacts field is not copied into the source package control
> file (dsc(5)), binary package control file (deb-control(5)),
> changes file (deb-changes(5)) or any other build results.
>
> (To prototype this without dpkg supporting it, X-Build-Artifacts would be
> appropriate, with none of the XS-, XB-, XC- prefixes.)
>
> For example, a package using Meson with recent debhelper versions would
> typically use:
>
> Build-Artifacts: obj-*/meson-logs
>
> or a package using recursive Automake might use:
>
> Build-Artifacts:
>  config.log
>  tests/*.log
>  tests/*.trs
>  tests/reftests/*.png
>
>
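The collection step of the straw-man spec above is straightforward to prototype. The following shell sketch illustrates only the matching and exclusion rules, and is not sbuild's actual implementation; it matches files only (no directory recursion), treats any ".."-containing pattern as unsafe, and ignores filenames containing whitespace for simplicity:

```shell
# collect_artifacts BUILD_DIR PATTERN...
# Print files under BUILD_DIR matching any pattern, one per line.
# Absolute patterns and patterns containing ".." are skipped, per the
# spec; a pattern matching nothing is silently ignored.
collect_artifacts() {
    builddir=$1; shift
    ( cd "$builddir" || exit 1
      for f in $(find . -type f | sed 's|^\./||' | sort); do
          for pat in "$@"; do
              case $pat in /*|*..*) continue ;; esac
              case $f in $pat) printf '%s\n' "$f"; break ;; esac
          done
      done )
}
```

For example, `collect_artifacts . config.log "tests/*.log"` would pick up the Automake logs from the recursive-Automake example quoted above.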

Re: Including build metadata in packages

2022-05-07 Thread Vagrant Cascadian
On 2022-02-14, Paul Wise wrote:
> On Sun, 2022-02-13 at 14:13 -0800, Vagrant Cascadian wrote:
>
>> * Split build metadata into a separate file or archive
>> 
>> Some of the debian-installer packages generate tarballs that are not
>> .deb files and are included in the .changes files when uploading to
>> the archive; making a similar generalized option for other packages to
>> put build metadata into a separate artifact might be workable approach,
>> although this would presumably require toolchain changes in dpkg and
>> dak at the very least, and might take a couple release cycles, which
>> is... well, debian.
>
> I already sent a mail like this in the past, but...
>
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=950585#30
>
> This approach is already in use in the archive, but not
> yet for the kind of artefacts that you are talking about:
>
> https://codesearch.debian.net/search?perpkg=1=dpkg-distaddfile
> https://salsa.debian.org/ftp-team/dak/raw/master/config/debian/dak.conf 
> (search for AutomaticByHandPackages)
> https://salsa.debian.org/ftp-team/dak/raw/master/daklib/upload.py (search for 
> byhand_files)
> https://salsa.debian.org/ftp-team/dak/tree/master/scripts/debian/
>
> I think this would not require anything except a new dak config stanza
> for AutomaticByHandPackage and potentially a patch to dak code or a
> script. Seems unlikely it would require changes to anything other than
> dak plus the packages that want to opt in to using it, so should be
> completely doable within the bookworm release cycle.

I took a peek at this approach finally, and it looks like binutils would
be relatively trivial to support, but things like gcc-11, gcc-N would
need regular intervention from ftpmasters (unless some sort of pattern
matching rules were added). At the least, it is one more step for NEW
processing to consider...


> If you want to have some way to automatically download the data, then
> something like apt-file and Contents files could be done, I expect that
> would also be able to be done for the bookworm release and also be
> possible to put in bullseye-backports.
>
> You could even include all the build logs and build info in the same
> data set, and potentially modify the package build process so that
> build logs for maintainer built binaries end up there too.
>
> Something like this would be my suggested archive structure:
>
> Release -> Builds-amd64 -> foo_amd64.build
>  \ \-> foo_amd64.buildinfo
>   \--> foo_amd64.buildmeta.tar.xz
>
> Or since the buildinfo files are similar to Packages/Sources stanzas:
>
> Release -> BuildLogs-amd64 -> foo_amd64.build.xz
>  \ \-> BuildInfo-amd64
>   \--> BuildMeta-amd64 -> foo_amd64.buildmeta.tar.xz

>
> This could be in the main archive, or a separate debian-builds archive.

For reproducible builds, this doesn't actually end up being very
different from just shipping a FOO-unreproducible.deb, as reproducible
builds tests would still have to exclude FOO_ARCH.buildmeta.tar.xz from
the comparisons anyways, if they end up being listed in the .changes
(and .buildinfo?) files.

Though, it does open the door to other possibilities. Putting all the
.buildinfo files in this buildmeta.tar.xz would finally get in-archive
distribution of .buildinfo files, which would be very nice...


live well,
  vagrant




Re: Status of Required/Essential/Build-Essential in Debian

2022-04-29 Thread Vagrant Cascadian
On 2022-04-28, Chris Lamb wrote:
>> Lately, I've been trying to get a handle on the status of the really
>> core packages in Debian, namely the essential, required and
>> build-essential package sets. The first two are present on nearly every
>> Debian system, and build-essential is the set of packages assumed to be
>> available whenever you build a package in Debian.
>
> Wow, this is an excellent summary of the status here — thanks! And
> indeed, an explicit & extra thank you for taking the time to write it
> up so it could be posted here — I made an analogous and scrappy list
> for myself in ~2018 to target these high-value packages, but I was
> now clearly in error in not taking the time to clean it up and share
> it... :(

The fact that the list has been for the most part shrinking is part of
what made me excited to take the time to write it up. :)


>> The more difficult issue with apt is caused by toolchain bugs in
>> doxygen:
>>
>> https://tests.reproducible-builds.org/debian/issues/nondeterminstic_todo_identifiers_in_documentation_generated_by_doxygen_issue.html
>>
>> https://tests.reproducible-builds.org/debian/issues/nondeterministic_ordering_in_documentation_generated_by_doxygen_issue.html
>
> For the sake of completeness (we talked about this issue elsewhere
> recently, Vagrant; this is for the benefit of this list/thread..),
> although there are only two notes linked immediately above, I suspect
> there are more than the two issues in doxygen, and the program could
> well warrant a sustained attack... especially as it would affect a
> *lot* of packages.

Yes, there are definitely more problems, and it would be lovely to fix
them all! For apt's specific case, I *think* solving those two issues
might be sufficient.


live well,
  vagrant




Status of Required/Essential/Build-Essential in Debian

2022-04-27 Thread Vagrant Cascadian
Lately, I've been trying to get a handle on the status of the really
core packages in Debian, namely the essential, required and
build-essential package sets. The first two are present on nearly every
Debian system, and build-essential is the set of packages assumed to be
available whenever you build a package in Debian.

I will summarize below the outstanding issues for Debian with these
package sets.

I'd also be really curious to hear about the status of similar package
sets in other distros! I would also like to see if there is anything in
Debian or other distros that still needs to be pushed upstream, so we
can all benefit!


Essential:

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_essential.html

Almost done with essential, at 95% reproducible:

The only outlier is glibc, which currently doesn't build, but a version
that does build in debian experimental has a patch submitted specific to
Debian's packaging of glibc:

  different file permissions on ld.so.conf* and others
  https://bugs.debian.org/1010233


Required:

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_required.html

Also nearly there, at 88.9% reproducible (and one probably obsolete
package in the list, gcc-9):


apt has two remaining issues, one of which is trivial to fix:

 BuildId differences triggered by RPATH
 https://bugs.debian.org/1009796

The more difficult issue with apt is caused by toolchain bugs in
doxygen:

 
https://tests.reproducible-builds.org/debian/issues/nondeterminstic_todo_identifiers_in_documentation_generated_by_doxygen_issue.html
 
https://tests.reproducible-builds.org/debian/issues/nondeterministic_ordering_in_documentation_generated_by_doxygen_issue.html

There is a workaround patch for apt to disable building of documentation:

 support "nodoc" build profile
 https://bugs.debian.org/1009797


Build-Essential:

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_build-essential.html

Not bad at 87.1% reproducible.

linux has two issues, one unidentified issue relating to build paths,
and another documentation issue:

  
https://tests.reproducible-builds.org/debian/issues/randomness_in_documentation_generated_by_sphinx_issue.html


libzstd has one remaining issue, where it embeds build paths in assembly
objects:

  
https://tests.reproducible-builds.org/debian/issues/build_path_captured_in_assembly_objects_issue.html


gmp has one outstanding set of patches to fix build path issues:

  Embedded build paths in various files
  https://bugs.debian.org/1009931


binutils has several identified issues and probably some unidentified
issues:

  included log files introduce reproducibility issues (debian specific?)
  https://bugs.debian.org/950585
  
https://tests.reproducible-builds.org/debian/issues/unstable/test_suite_logs_issue.html

  source tarball embeds build user and group (debian specific)
  https://bugs.debian.org/1010238


krb5 has one really perplexing issue related to build paths triggering
seemingly unrelated changes in the documentation, possibly toolchain
related (sphinx? doxygen?):

  differing build paths trigger different documentation
  https://bugs.debian.org/1000837


gcc-12 (and probably other gcc variants) also embeds test suite logs
very similar to binutils as described above. Probably many other issues,
especially related to build-time profile-guided-optimization and... who
knows! GCC also takes so long to build, it can be difficult for our test
infrastructure to actually build and/or run diffoscope without timing
out...


openssl contains a few unidentified issues relating to build paths, some
test suite failures in our test infrastructure, and a couple known build
path related issues:

  
https://tests.reproducible-builds.org/debian/issues/build_path_captured_in_assembly_objects_issue.html

  Embedded compiler flags contain build paths
  https://bugs.debian.org/1009934


Build-Essential-Depends Bonus Round! (all the packages that
Build-Essential needs to build itself):

  
https://tests.reproducible-builds.org/debian/unstable/amd64/pkg_set_build-essential-depends.html

At 86.3% reproducible, it still doesn't look too bad, and there are a
lot of patches submitted and/or in progress. It is a much larger set of
packages, so I won't even try to summarize the status here.


A few closing thoughts...

A fair number of these are build paths issues, which we do not test in
Debian testing (currently bookworm), only in debian unstable and
experimental. So the numbers in general look a bit better for
testing/bookworm. Other distros by-and-large do not test build paths
variations, and while I'd like to fix those issues, they're a little
lower-priority.

Two other remaining issues are toolchain issues for documentation using
sphinx and doxygen, and are the last blockers for fixing apt and linux
(as well as numerous other packages). This seems like a high priority to
fix!

I have been chewing on the ideas of how to resolve the 

Re: randomness_in_r_rdb_rds_databases not very random?

2022-04-18 Thread Vagrant Cascadian
On 2022-04-18, Chris Lamb wrote:
>> Because of this discrepancy, I created the issue
>> "captures_build_path_in_r_rdb_rds_databases" but I also noticed the
>> description for "randomness_in_r_rdb_rds_databases" mentions that it is
>> the result of a build path:
>>
>>   Randomness seems to come from using absolute paths in .rd[bs] files.
>
> I think a factor here is the autoclassification script that, as I
> understand it, does not differentiate between these two cases.
>
> Now, I did spend significant time on R a couple of months ago, only to
> realise that a proper fix (let alone an upstreamable one) is quite
> some work away. Which, more relevant to this conversation, meant that
> delineating the finer distinctions between R's many issues became less
> of a priority. (This is all background and/or explanation for the
> status quo, mind you; I would agree it's always better to have better
> distinctions.)

Makes sense to me, thanks!


>> I think most of the fixes mentioned in the description of
>> "randomness_in_r_rdb_rds_databases" have been applied.
>
> Interesting. I'm not quite as confident about this as you seem to be,
> but glad to be re-assured.

Oh, maybe I miscommunicated my confidence level then; there may still
be dragons! I only did very brief follow-up reading of the referenced
issues in the description...


live well,
  vagrant




randomness_in_r_rdb_rds_databases not very random?

2022-04-12 Thread Vagrant Cascadian
I've noticed nearly all of the packages marked
"randomness_in_r_rdb_rds_databases" actually appear to be not random at
all, but rather a deterministic result derived from the build path:

~1200 packages that FTBR in unstable:

  
https://tests.reproducible-builds.org/debian/issues/unstable/randomness_in_r_rdb_rds_databases_issue.html

~1200 build reproducibly in bookworm:

  
https://tests.reproducible-builds.org/debian/issues/bookworm/randomness_in_r_rdb_rds_databases_issue.html

Because of this discrepancy, I created the issue
"captures_build_path_in_r_rdb_rds_databases" but I also noticed the
description for "randomness_in_r_rdb_rds_databases" mentions that it is
the result of a build path:

  Randomness seems to come from using absolute paths in .rd[bs] files.

I think most of the fixes mentioned in the description of
"randomness_in_r_rdb_rds_databases" have been applied.


Should this be clarified somehow?

Our notes are generally a best effort thing and we don't want to get too
caught up in exacting details or semantic arguments, but it might be
good to be more specific when we can? :)


live well,
  vagrant




Re: Embedded buildpath via rpath using cmake

2022-04-03 Thread Vagrant Cascadian
On 2022-02-03, Vagrant Cascadian wrote:
> Over the last several months, I and others have found quite a few
> packages that embed build paths via rpath when building with cmake.  I
> found myself slowly edging into a mass bug filing, one bug report at a
> time...
>
> I ended up submitting a few patches and noting some affected packages:
>
>   
> https://tests.reproducible-builds.org/debian/issues/unstable/cmake_rpath_contains_build_path_issue.html
>
> There are almost certainly packages missing from that list, as it is
> generated by human confirmation... 

So in the last couple months I kept finding more packages affected by
this; the above URL now has confirmed 380+ packages affected by this
issue, a few of which are now fixed, thanks!

On those I've tested and confirmed, I've either submitted a patch and/or
mentioned in comments for the package in the reproducible builds notes,
which you can see by clicking on the referenced package in the above
URL, or searching for the relevant package in:

  
https://salsa.debian.org/reproducible-builds/reproducible-notes/-/blob/master/packages.yml
  

I don't know for sure that this is a comprehensive list of affected
packages; I've mostly identified packages that fail to build with build
path variations and had otherwise no known cause, or used other
identified issues that apparently had a direct correlation to this
issue.

Doing a systematic search for all packages that use cmake to build and
fail to build reproducibly in unstable or experimental would probably be
the next step to identify any remaining packages...


> In many cases I've tested so far, passing an argument via a
> dh_auto_configure override in debian/rules fixes the issue:
>
>override_dh_auto_configure:
>dh_auto_configure -- -DCMAKE_BUILD_RPATH_USE_ORIGIN=ON
>
>
> Alternately, the experimental debhelper compat level v14 does include a
> fix for these embedded rpaths, though in its current state, passing both
> -DCMAKE_SKIP_RPATH=ON and -DCMAKE_RPATH_USE_ORIGIN=ON, it triggers build
> failures in 263 packages, according to a test run by Lucas Nussbaum in
> October:
>
>   http://qa-logs.debian.net/2021/10/25/diff.dcsr.txt
>
>
> Since debhelper v14 is not finalized yet, I just sent a request to
> debhelper to only pass one of the arguments,
> -DCMAKE_RPATH_USE_ORIGIN=ON, which should significantly reduce the
> number of build failures while still making many packages reproducible
> with debhelper compat v14:
>
>   https://bugs.debian.org/1004939

Haven't gotten any comment on this from the debhelper maintainers yet...

There are a few packages where -DCMAKE_RPATH_USE_ORIGIN=ON does trigger
test failures or otherwise causes build failures (some had test suite
failures even without changes), but my off-the-cuff guess is ~2% of the
~380 noted packages; fewer than I could count on both hands using very
simple methods. This should be significantly fewer than with
-DCMAKE_SKIP_RPATH=ON...


It seems it is not actually possible to create something like a lintian
warning for this, as the build path is stripped out before the .deb
package is created; for the most part the only result is a different
build id and a few small changes in the binaries. I would, of course, be
happy to be proven wrong!


I've added a new list of affected maintainers, produced with dd-list
from the packages marked with the "cmake_rpath_contains_build_path"
issue that haven't yet been fixed in some way according to
tests.reproducible-builds.org.


Thanks everyone!


live well,
  vagrant

"Adam C. Powell, IV" 
   oce (U)

A. Maitland Bottoms 
   airspyone-host
   codec2
   gr-fosphor
   gr-funcube (U)
   gr-hpsdr (U)
   gr-iqbal
   gr-osmosdr
   gr-radar
   gr-rds
   hackrf
   libfreesrp
   rtl-sdr
   volk

Adam Borowski 
   pmdk-convert
   pmemkv

Adrian Knoth 
   libdrumstick (U)

Alastair McKinstry 
   ecflow
   mathgl (U)

Alberto Garcia 
   cog

Alberto Luaces Fernández 
   openscenegraph

Alessio Treglia 
   fluidsynth (U)
   libdrumstick (U)

Alf Gaida 
   libqtxdg (U)
   lxqt-config (U)
   lxqt-globalkeys (U)
   nomacs (U)
   screengrab (U)

Andrea Capriotti 
   userbindmount (U)
   vdeplug4 (U)

Andreas Bombe 
   soapyosmo (U)
   soapysdr (U)

Andreas Cord-Landwehr 
   kdevelop-python (U)

Andreas Metzler 
   hugin (U)
   libpano13 (U)

Andreas Rönnquist 
   allegro5 (U)

Andreas Tille 
   bamtools (U)
   civetweb (U)
   libminc (U)
   openmm (U)
   prime-phylo (U)
   spoa (U)

Andrew Lee (李健秋) 
   libqtxdg (U)
   lxqt-config (U)
   lxqt-globalkeys (U)
   nomacs (U)
   screengrab (U)

Andrey Rahmatullin 
   librsync

Andrius Merkys 
   libemf2svg (U)
   macromoleculebuilder (U)
   openmm (U)

Antoine Beaupré 
   slop

Anton Gladky 
   libopenshot (U)
   liggghts (U)
   metis (U)
   tetgen (U)

Antonio Ospite 
   libam7xxx

Apollon Oikonomopoulos 
   leatherman (U)

APT Development Team 
   apt

Arne Bernin

Re: haveged: status change on tests.reproducible-builds.org/debian

2022-03-08 Thread Vagrant Cascadian
On 2022-03-09, Cyril Brulebois wrote:
> Reproducible Builds folks  
> (2022-03-09):
>> The reproducibility status of the package haveged changed during the
>> continuous testing.
>> See the following notes for more details:
>> 
>> 2022-03-08 01:09 
>> https://tests.reproducible-builds.org/debian/unstable/amd64/haveged changed 
>> from reproducible -> FTBFS
>> 
>> Feel free to reply to this email if you have questions regarding
>> this automatic notification.
>
> Let's do that: a pointer to the CI / build log would be helpful.

The URL mentioned above has links to the build logs in the sidebar at
the left.

It did succeed on the first build, but failed on the second build. It
seems there were some test suite failures:

  
https://tests.reproducible-builds.org/debian/logs/unstable/amd64/haveged_1.9.14-1.build2.log.gz

Maybe this is the error?

  Check Fail: chisqr:99.670454% not in 1.00-99.00


Thanks for taking an interest in reproducible builds!


live well,
  vagrant




Including build metadata in packages

2022-02-13 Thread Vagrant Cascadian
A while ago I noticed binutils had some embedded logs in one of its
packages, which included timing information about the test suite runs
that will almost certainly differ between builds, even on the exact
same machine:

  https://bugs.debian.org/950585

My proposed patch removed the timing information and various other
things, but that was exactly the information wanted from these files, so
it was not an appropriate patch.


It also became known that other key toolchain packages (e.g. gcc) also
embed similar log files in the .deb packages... I have since found a few
other packages that do similar things:

  
https://tests.reproducible-builds.org/debian/issues/unstable/test_suite_logs_issue.html


Obviously, this would interfere with any meaningful reproducible builds
testing for any package that did something like this. Ideally metadata
like this about a build should *not* be included in the .deb files
themselves.


I'll try to summarize and detail a bit some of the proposed strategies
for resolving this issue:


* output plaintext data to the build log

Some of these log files are large (>13MB? per architecture, per package
build) and would greatly benefit from compression...

How large is too large for this approach to work?

Relatively simple to implement (at least for plain text logs), but
potentially stores a lot of data on the buildd infrastructure...


* Selectively filter out known unreproducible files

This adds complexity to the process of verification; you can't beat the
simplicity of comparing checksums on two .deb files.
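
For contrast, the baseline that any filtering scheme has to beat, as a
trivial sketch:

```shell
# With bit-for-bit reproducibility, verification is a one-line
# byte comparison of the two .deb files; no filters, no parsing.
reproduced() { cmp -s "$1" "$2" && echo REPRODUCIBLE || echo DIFFERS; }
```

Every file that has to be filtered out before comparison is a step away
from that one-liner.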

With increased complexity comes increased opportunity for errors, as
well as maintenance overhead.

RPM packages, for example, embed signatures in the packages, and these
need to be excluded for comparison.

I vaguely recall at least one past case where attempting something like
this resulted in packages incorrectly being reported as reproducible
because the filter was overly broad...

Some nasty corner cases probably lurk down this approach...


* Split build metadata into a separate .deb file

This shares some of the problems of the previous approach, though it
may be a little easier to get a reliable exclusion pattern? It wouldn't
require huge toolchain changes.

I would expect that such packages would not actually be depended on by
any other packages, and would *only* contain build metadata. Maybe named
SOURCEPACKAGE-buildmetadata-unreproducible.deb ... or ?

Not beautiful or elegant, but maybe actually achievable for bookworm
release cycle?


* Split build metadata into a separate file or archive

Some of the debian-installer packages generate tarballs that are not
.deb files and are included in the .changes files when uploading to the
archive; making a similar generalized option for other packages to put
build metadata into a separate artifact might be a workable approach,
although this would presumably require toolchain changes in dpkg and dak
at the very least, and might take a couple release cycles, which
is... well, debian.

The possibility of bundling up .buildinfo files into this metadata too,
while requiring some changes in the relevant dpkg, dak, etc. tooling,
might in the long term be worth exploring.

There was a relevant bug report in launchpad:

  https://bugs.launchpad.net/launchpad/+bug/1845159

This seems like the best long-term approach, but pretty much *only* a
long-term approach...


I'd really like to remove this hurdle to reproducible builds from some
key packages like binutils and gcc, but I'm also curious about a
generalizable approach, so that each package needing something like this
doesn't reinvent the wheel in incompatible ways...


Curious to hear your thoughts!


live well,
  vagrant

p.s. please consider CCing me and/or
reproducible-bui...@lists.alioth.debian.org, as I'm not subscribed to
debian-devel.




Re: Embedded buildpath via rpath using cmake

2022-02-04 Thread Vagrant Cascadian
On 2022-02-04, Seth Arnold wrote:
> On Thu, Feb 03, 2022 at 04:41:21PM -0800, Vagrant Cascadian wrote:
>> Over the last several months, I and others have found quite a few
>> packages that embed build paths via rpath when building with cmake.  I
>> found myself slowly edging into a mass bug filing, one bug report at a
>> time...
>
> Hello Vagrant, does this represent a security problem?

Other than reproducible builds in general providing some security
properties, I would say not really.


> I tried to give this a look myself but didn't know what to look for; I
> grabbed a few recent versions of packages:
>
> http://ftp.debian.org/debian/pool/main/n/nfs-ganesha/nfs-ganesha_3.4-1_amd64.deb

One thing is checking the reproducible builds results:

  
https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/diffoscope-results/nfs-ganesha.html

These appear to have reproducibility issues in the unstable tests, where
build paths are varied, but not in bookworm, where build paths are not
varied.

Unfortunately, the diffoscope output linked above does not obviously
show the build path embedded in the binaries (other than some .py files,
which may be a separate issue).

There are a few lines which are non-obvious, but are in my experience a
sign of different build paths:

  0x000a·(STRSZ)··2327·(bytes)
vs.
  0x000a·(STRSZ)··2329·(bytes)

My going theory is that the length of the build path is embedded in a
padded value, even though the build path itself is actually stripped,
perhaps via -ffile-prefix-map=BUILDPATH=. or similar.
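
One way to check for that symptom, sketched as a small helper (readelf
is from binutils; which two builds you point it at is up to you):

```shell
# compare_strsz: print the STRSZ dynamic-section entry of two builds of
# the same binary. A differing size with no visible path in the diff is
# a hint that the build-path *length* leaked into string-table padding.
compare_strsz() {
    for bin in "$1" "$2"; do
        readelf -d "$bin" | grep STRSZ
    done
}
```

For example, `compare_strsz build1/usr/bin/foo build2/usr/bin/foo`
(hypothetical paths) would show the 2327 vs. 2329 byte difference quoted
above.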


> http://ftp.debian.org/debian/pool/main/v/vmemcache/libvmemcache0_0.8.1-4_amd64.deb
> http://ftp.debian.org/debian/pool/main/f/fontforge/fontforge_20201107~dfsg-4_amd64.deb
>
> $ find . -type f -exec eu-readelf -d {} \; 2>/dev/null | grep RUNPATH
>   RUNPATH   Library runpath: [/usr/lib/ganesha]
>   RUNPATH   Library runpath: [/usr/lib/ganesha]
>   RUNPATH   Library runpath: [/usr/lib/ganesha]
>   RUNPATH   Library runpath: [/usr/lib/ganesha]
>
> Am I on the wrong track?

Because it doesn't often leave obvious traces of the build path in the
binaries, it is a bit tricky to test simply by examining the binaries
directly... Instead, experimentation seems to be the best way.

I use reprotest for this, first running a build with reprotest without
the patch, confirming that it builds at all, and does not build
reproducibly. Then running reprotest with the patch applied to add
-DCMAKE_BUILD_RPATH_USE_ORIGIN=ON in debian/rules, and seeing if it
builds reproducibly.

From the source directory, with the build dependencies installed:

  reprotest --verbose --store-dir=$(mktemp -d $HOME/buildresults-XXXXXX) --vary=-all,+build_path -- null


This should normalize the build as much as possible so that the only
thing different between the two builds is the build path.

Then compare the resulting buildresults-*/*.out to see if the second
build produces significantly less diffoscope output...


live well,
  vagrant




Re: Embedded buildpath via rpath using cmake

2022-02-04 Thread Vagrant Cascadian
On 2022-02-04, Paul Wise wrote:
> Vagrant Cascadian wrote:
>
>> Over the last several months, I and others have found quite a few
>> packages that embed build paths via rpath when building with cmake.
>
> This seems like the sort of thing that will be an ongoing problem, so
> if it is detectable statically then a lintian warning might be good.

So far I have only figured out how to detect it by building packages and
checking if they're reproducible, but if someone can figure out how to
make it work from lintian, so much the better!

I believe there is a lintian check for build paths embedded in binaries,
at least, which will catch this and other issues, but maybe it could be
extended to check for this more explicitly...


live well,
  vagrant




Bug#1004950: reprotest fails when using kk_KZ.RK1048 locale with java (locale not supported by java)

2022-02-04 Thread Vagrant Cascadian
On 2022-02-04, Fab Stz wrote:
> I have a CI job that runs reprotest. The project, at some time, makes calls to
> java. However this fails/crashes sometimes.
>
> By narrowing things down it always crashes when locale kk_KZ.RK1048 is in use.
>
> By searching further I discovered that kk_KZ.RK1048 locale/charset is not 
> supported by java [1]. Hence the crash:
>
> $ LC_ALL=kk_KZ.RK1048 java -version
> Error occurred during initialization of VM
> java.lang.IllegalArgumentException: Null charset name
> at java.nio.charset.Charset.lookup(java.base/Charset.java:455)
...
> $ LC_ALL=C java -version
> openjdk version "11.0.12" 2021-07-20
> OpenJDK Runtime Environment (build 11.0.12+7-post-Debian-2)
> OpenJDK 64-Bit Server VM (build 11.0.12+7-post-Debian-2, mixed mode, sharing)
>
>
> Expected behavior? That it works also with call to java? or maybe an option to
> disable this locale if java is used?
>
> [1]:
> https://docs.oracle.com/javase/8/docs/technotes/guides/intl/encoding.doc.html

Well, reprotest intentionally uses some obscure locales to find bugs
exactly like this...

That said, this points out two issues, and I'm considering making some
changes to locale handling for the "experiment" tests. Currently:

  loc = random.choice(['fr_CH.UTF-8', 'es_ES', 'ru_RU.CP1251', 'kk_KZ.RK1048', 'zh_CN'])

I don't think selecting a locale at random is a good idea; this means
sometimes a build might succeed with reprotest and sometimes not,
depending on which locale happens to be randomly selected.

Secondly, I think by default reprotest should use a slightly less
obscure locale to test, and the same locale every time.

Adding a commandline flag to build with a specified locale would also be
good. Possibly another flag to do a series of builds, each with a
different locale (maybe from a list of very obscure locales).
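
A hypothetical sketch of that "series of builds" idea, expressed as a
wrapper loop (the locale list mirrors the one quoted above; the actual
reprotest invocation is elided, as the proposed flags don't exist yet):

```shell
# One build per locale instead of one randomly-chosen locale: the
# failures then identify *which* locale breaks the build, rather than
# appearing and disappearing between runs.
for loc in fr_CH.UTF-8 es_ES ru_RU.CP1251 kk_KZ.RK1048 zh_CN; do
    echo "=== build with LANG=$loc ==="
    # reprotest --vary=-all,+locales ... (with $loc forced)
done
```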


live well,
  vagrant




Embedded buildpath via rpath using cmake

2022-02-03 Thread Vagrant Cascadian
Over the last several months, I and others have found quite a few
packages that embed build paths via rpath when building with cmake.  I
found myself slowly edging into a mass bug filing, one bug report at a
time...

I ended up submitting a few patches and noting some affected packages:

  
https://tests.reproducible-builds.org/debian/issues/unstable/cmake_rpath_contains_build_path_issue.html

There are almost certainly packages missing from that list, as it is
generated by human confirmation... 


In many cases I've tested so far, passing an argument via a
dh_auto_configure override in debian/rules fixes the issue:

   override_dh_auto_configure:
   dh_auto_configure -- -DCMAKE_BUILD_RPATH_USE_ORIGIN=ON


Alternately, the experimental debhelper compat level v14 does include a
fix for these embedded rpaths, though in its current state, passing both
-DCMAKE_SKIP_RPATH=ON and -DCMAKE_RPATH_USE_ORIGIN=ON, it triggers build
failures in 263 packages, according to a test run by Lucas Nussbaum in
October:

  http://qa-logs.debian.net/2021/10/25/diff.dcsr.txt


Since debhelper v14 is not finalized yet, I just sent a request to
debhelper to only pass one of the arguments,
-DCMAKE_RPATH_USE_ORIGIN=ON, which should significantly reduce the
number of build failures while still making many packages reproducible
with debhelper compat v14:

  https://bugs.debian.org/1004939


I've attached a list of the maintainers of affected packages, produced
with dd-list from the packages in the above-mentioned reproducible
builds issue and in diff.dcsr.txt from the archive rebuild.

If you're on the list, I would love it if you could check whether your
package still builds correctly when passing only
-DCMAKE_BUILD_RPATH_USE_ORIGIN=ON.  For a few of the packages, there are
already patches in the Debian bug tracking system waiting for you!


Thanks everyone!


live well,
  vagrant
"Adam C. Powell, IV" 
   oce (U)

A. Maitland Bottoms 
   gr-funcube (U)
   gr-gsm (U)
   gr-hpsdr (U)
   gr-iqbal
   gr-limesdr (U)
   gr-satellites (U)
   libad9361

Adam Borowski 
   pmemkv
   vmemcache

Aigars Mahinovs 
   dlt-daemon

Alastair McKinstry 
   libtool
   mathgl (U)

Alberto Luaces Fernández 
   openscenegraph

Alessio Di Mauro 
   yubico-piv-tool (U)

Alessio Treglia 
   leveldb (U)
   libsoxr (U)

Alexander GQ Gerasiov 
   croaring

Alexandre Marie 
   ufo-core (U)

Alf Gaida 
   juffed (U)
   screengrab (U)

Andrea Capriotti 
   userbindmount (U)
   vdeplug4 (U)

Andreas Beckmann 
   pocl (U)

Andreas Bombe 
   gr-limesdr (U)
   soapyosmo (U)
   soapysdr (U)

Andreas Rönnquist 
   allegro5 (U)

Andreas Tille 
   libbpp-seq (U)
   libbpp-seq-omics (U)
   liblemon (U)
   libminc (U)
   libvbz-hdf-plugin (U)
   libzeep (U)
   openmm (U)
   spoa (U)

Andrew Lee (李健秋) 
   screengrab (U)

Andrew Pollock 
   log4cplus

Andrey Rahmatullin 
   librsync

Andrius Merkys 
   openmm (U)
   openstructure (U)

Ansgar 
   dune-common (U)
   dune-geometry (U)
   dune-grid (U)
   dune-grid-glue (U)
   dune-uggrid (U)

Anthony Fok 
   fontforge (U)

Anton Gladky 
   alglib (U)
   benchmark (U)
   cctz (U)
   kim-api (U)
   libopenshot (U)
   liggghts (U)
   metis (U)
   tetgen (U)
   vtk9 (U)

Apollon Oikonomopoulos 
   leatherman (U)

Arne Bernin 
   libfreenect (U)

Aron Xu 
   fcitx (U)
   libgooglepinyin (U)

Aurelien Jarno 
   libftdi
   libftdi1

Aurélien COUDERC 
   analitza (U)
   artikulate (U)
   elisa-player (U)
   kdebugsettings (U)
   keditbookmarks (U)
   kget (U)
   libkeduvocdocument (U)
   okteta (U)

Ayatana Packagers 
   qmenumodel

Barak A. Pearlmutter 
   cppad (U)
   mlpack (U)

Bartosz Fenski 
   supertux (U)

Bas Couwenberg 
   geos (U)
   qgis (U)
   sfcgal (U)

Ben Burton 
   regina-normal

Benjamin Barenblat 
   abseil

Benjamin Drung 
   libsoxr (U)

Bjoern Ricks 
   grantlee5 (U)

Boian Bonev 
   gammu

Boris Pek 
   eiskaltdcpp

Boyuan Yang 
   cjson
   fcitx5 (U)
   fcitx5-gtk (U)
   fcitx5-qt (U)
   go-for-it
   libavif (U)
   libime (U)
   libxlsxwriter (U)
   qevercloud
   tidy-html5 (U)
   xcb-imdkit (U)
   zxing-cpp

Bret Curtis 
   recastnavigation (U)

Carlos Donizete Froes 
   surgescript (U)

CESNET 
   libyang (U)

ChangZhuo Chen (陳昌倬) 
   juffed (U)
   screengrab (U)

Chow Loong Jin 
   tinyxml2

Christoph Berg 
   gr-limesdr (U)
   gr-satellites (U)
   libcm256cc (U)

Christoph Junghans 
   votca-csg (U)
   votca-tools (U)

Christoph Martin 
   nfs-ganesha (U)

Connor Imes 
   powercap

Cristian Greco 
   poco (U)

Dain Nilsson 
   yubico-piv-tool (U)

Daniel Kahn Gillmor 
   fontforge (U)

Daniel Schepler 
   kpat (U)
   libkdegames (U)

David Bremner 
   ledger

David Lamparter 
   libyang

David Prévot 
   cmocka

Davide Viti 
   fontforge (U)

Debian Astro Team 
   purify
   sopt

Debian Authentication Maintainers 
   yubico-piv-tool

Debian Bridges Team 
   libcork
   libcorkipset

Debian Deep Learning Team 
   pthreadpool

Debian Deepin Packaging Team 
   libxlsxwriter (U)

Debian Fonts Task Force 
   fontforge

Re: dh-perl6 vs. dh-raku: reproducibility issues with vendor/precompiled

2022-01-20 Thread Vagrant Cascadian
On 2022-01-20, Chris Lamb wrote:
>>> I just noticed a reproducibility issue in a package that transitioned
>>> from dh-perl6 to dh-raku, and it introduced some reproducibility issues
>>> in the raku-tap-harness in precomp files, e.g.:
>>
>> I think this was already briefly discussed in #1002496
>
> Yes, indeed. As I mentioned in that bug, I initially thought they were
> accidentally-distributed temporary/build files; something that's
> actually quite common in Debian and comes up quite a lot when doing
> Reproducible Builds stuff.
>
> If I had realised they were the result of deliberate pre-compilation
> efforts, I would probably not have filed that bug. Or, rather: I
> wouldn't have done without a patch to fix the issue! In other words,
> sorry for the essentially unactionable bug, although it *is* serving as
> a useful place to dump information as we inch towards a solution.
>
> (I have included #1002496 on the CC of this thread, perhaps to avoid any
> potential duplication in the future.)
>
>>> But there aren't many [tagged] packages there (yay?), and the
>>> description is a bit terse suggesting that these files should not
>>> be shipped at all...
>>
>> Well …
>
> Oh, don't read into that description, Vagrant! That's likely my
> description based on my jejune understanding of the problem at the
> time (see above). Please feel free to update it — I have nothing
> against precompilation as a general rule.

Yes... of course, shortly after I sent the mail starting this thread I
found more information on this issue!

I've added links to the bug and wiki page describing perl6
precompilation files in our reproducible builds notes and will think
about how to better describe and/or even rename the issue. :)


>>> They appear to be hashed filenames, what goes into the hash that
>>> produces them (file path? timestamp? etc.), and could that be made
>>> reproducible?
>>
>> That would be nice indeed.
>>
>> I once experimented by comparing the "old"
>> precompiled-at-instalation-time and the precompiled-at-build-time
>> files on my laptop, and interesetingly they were the same. Or I
>> missed something. But yeah, rebuilding with reprepo shows that paths
>> are embedded which ist Not Good™.
>
> Thanks for confirming in reprepro. This is also confirmed by me at the
> end of #1002496. I haven't done any other investigating yet.

I presume "reprotest"? Which I've also used to confirm this issue with a
few packages.


live well,
  vagrant




dh-perl6 vs. dh-raku: reproducibility issues with vendor/precompiled

2022-01-18 Thread Vagrant Cascadian
Hi folks!

I just noticed a reproducibility issue in a package that transitioned
from dh-perl6 to dh-raku, and it introduced some reproducibility issues
in the raku-tap-harness in precomp files, e.g.:

  
/usr/lib/perl6/vendor/precomp/69C6BF38EFE2DAC4C04DDF24DDCABC2CDF55A623/19/19DE6D93D4B9F2400823F8E9A00C571E42ADF6C3

Before switching from dh-perl6 to dh-raku, raku-tap-harness seemed to
consistently build reproducibly, and when I reverted back to using
dh-perl6 and tested locally, it built reproducibly.


We're tracking this issue in reproducible builds as:

  
https://tests.reproducible-builds.org/debian/issues/unstable/randomness_in_perl6_precompiled_libraries_issue.html

But there aren't many packages there (yay?), and the description is a
bit terse, suggesting that these files should not be shipped at all...


Not knowing much about perl6 ... are the precompiled files needed in
installed packages?  Could they be generated at package install time
rather than package build time (like .pyc files for python in Debian)?
They appear to be hashed filenames; what goes into the hash that
produces them (file path? timestamp? etc.), and could that be made
reproducible?


There were also some embedded build paths in this build done recently:

  
https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/diffoscope-results/raku-tap-harness.html
  
  ntArrayB.../build/1st/raku-tap-harness-0.2.1
 vs.
  ntArrayF.../build/2/raku-tap-harness-0.2.1/2nd..

Though I was unable to reproduce this locally...


Thanks!


live well,
  vagrant




systemd reproducibility on armhf

2021-12-13 Thread Vagrant Cascadian
I finally did a reprotest build of systemd on armhf to try and figure
out why it doesn't build reproducibly... but it built reproducibly...

My test did not test building with a 64-bit kernel (it was using a
32-bit kernel in both cases), whereas the tests.reproducible-builds.org
infrastructure systematically tests one build with 64-bit kernel and one
with a 32-bit kernel...

The build done with a 32-bit kernel includes a reference to
"arm_fadvise64_64", whereas the build with a 64-bit kernel does
not:

  
https://tests.reproducible-builds.org/debian/rb-pkg/bookworm/armhf/diffoscope-results/systemd.html

Does fadvise (posix_fadvise?) on 64-bit need no special handling,
whereas on 32-bit it needs a wrapper function of some kind?

I'm not sure exactly where this behavior comes from (quite possibly in a
dependent library and not systemd itself), but the build features should
be determined by the userland (armhf) it is built in, not by which
kernel happens to be running.
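
A hedged sketch for chasing the reference down (strings is from
binutils; the symbol name comes from the diffoscope output above, and
the files you scan are whatever your build produced):

```shell
# has_fadvise_wrapper: report whether a built object references the
# 32-bit arm_fadvise64_64 wrapper, to help bisect which binary or
# dependent library the reference comes from.
has_fadvise_wrapper() {
    strings "$1" | grep -q arm_fadvise64_64
}
```

Running it over the build trees from the 32-bit and 64-bit kernel
builds should show which objects differ.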


Sorry I don't have more conclusive results, but at least this suggests a
direction to look into.


live well,
  vagrant




Bug#1001202: about default locales LANGUAGE: set to C or unset

2021-12-06 Thread Vagrant Cascadian
On 2021-12-06, xiao sheng wen(肖盛文) wrote:
>   I suggestion set default locales LANGUAGE to C  or unset LANGUAGE env.
> At present, it set to en_US:en, this will cause some problems in no en env.
>
> The C LANGUAGE would be the default locales when set it.
> If no vary, not set LANG and LANGUAGE is also ok.

My understanding was that LANGUAGE, unlike LANG and LC_*, is not valid
as "C", though I'm having a difficult time finding references for what
the valid settings for LANGUAGE actually are.


live well,
  vagrant
  
> diff -u build.py atzlinux.build.py
> --- build.py2021-12-06 18:07:33.653809577 +0800
> +++ atzlinux.build.py   2021-12-06 18:12:17.407143396 +0800
> @@ -365,7 +365,7 @@
>  # list question.
>  def locales(ctx, build, vary):
>  if not vary:
> -return build.add_env('LANG', 'C.UTF-8').add_env('LANGUAGE',
> 'en_US:en')
> +return build.add_env('LANG', 'C.UTF-8').add_env('LANGUAGE', 'C')
>  else:
>  # if there is an issue with this being random, we could instead 
> select
> it
>  # based on a deterministic hash of the inputs




Bug#999689: reprotest: autopkgtest missing dependency on diffoscope

2021-11-14 Thread Vagrant Cascadian
Control: tags 999689 pending

On 2021-11-14, Stefano Rivera wrote:
> Fixing #988964 broke the "not need_builddeps" autopkgtest:
> https://ci.debian.net/packages/r/reprotest/unstable/amd64/
>
> It has failed since 0.7.16.
>
> The easy fix is:
> diff --git a/debian/tests/control b/debian/tests/control
> index 8132bdc..e5a47b8 100644
> --- a/debian/tests/control
> +++ b/debian/tests/control
> @@ -1,5 +1,5 @@
>  Test-Command: debian/rules autopkgtest-pytest PYTEST_MARKEXPR="not 
> need_builddeps"
> -Depends: @, python3-pytest, faketime, locales-all, fakeroot
> +Depends: @, diffoscope, python3-pytest, faketime, locales-all, fakeroot
>  
>  Test-Command: debian/rules autopkgtest-pytest 
> PYTEST_MARKEXPR="need_builddeps"
>  Depends: @, @builddeps@, fakeroot


Pushed to git:

  
https://salsa.debian.org/reproducible-builds/reprotest/-/commit/3ed09e34943e49c757ff51796acd557e91547b04

Thanks!

live well,
  vagrant




Re: Mapping Reproducibility Bug Reports to Commits

2021-11-14 Thread Vagrant Cascadian
On 2021-11-14, Muhammad Hassan wrote:
> I am a researcher at the University of Waterloo, conducting a project
> to study reproducibility issues in Debian packages.

Great to hear!


> The first step for me is to link each Reproducibility-related bug at
> this link:
> https://bugs.debian.org/cgi-bin/pkgreport.cgi?usertag=reproducible-bui...@lists.alioth.debian.org
> to the corresponding commit that fixed the bug.
>
> However, I am unable to find an explicit way of doing so programatically. 
> Please assist.

This is, unfortunately, a non-trivial task; programmatically, this
information is not directly exposed anywhere (to my knowledge).

Two approaches come to mind: mining the VCS history, and parsing the bug
report mbox files...

Mining the Vcs history:

Most packages in debian are maintained in some sort of VCS, most of
which are git, and most of those are on salsa.debian.org, and most of
the time the VCS is updated in sync with the uploaded package...

I think your best bet is to parse the debian/control file to get the
Vcs-* fields and then parse the corresponding revision control systems
for commit logs which include lines like:

Closes: #NNN

(Closes: #NNN)

Closes: NNN

For some packages this will not be possible, but it would be good to get
the data of those that do have a corresponding commit and those that
don't.
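
A minimal sketch of the VCS-mining step, assuming the package's
repository is already cloned locally (the helper name is mine, and the
regex covers the "Closes: #NNN" variants listed above):

```shell
# find_closers: list commits in a package's git history whose message
# contains a bug-closing line, as candidates for "the commit that
# fixed the bug".
find_closers() {
    git -C "$1" log --oneline -E --grep='Closes: *#?[0-9]+'
}
```

Matching a specific bug number would then be a matter of grepping the
output (or passing a tighter --grep pattern).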


Parsing the bug report logs:

Another approach is to download the mbox file from each bug report, some
of which include auto-generated messages from salsa.debian.org with the
specific commit where it is marked as pending:

  https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=983588#18


Though marked as pending and included in Debian are two different
things; depending on the nature of your study you may need to confirm
that it was actually uploaded!


live well,
  vagrant




Comparing bullseyes to bullseyes

2021-10-26 Thread Vagrant Cascadian
So, we have some reasonable numbers about actual reproducibility for
Debian bullseye from the "beta" tests; it looks to be over 90%
reproducible, which is great considering it is measured against packages
people actually use in the real world!

The "beta" tests wil miss out on toolchain fixes introduced since the
last time the package was built, as it builds using the same toolchain,
whereas the other tests use the toolchains currently in bullseye.

  
https://tests.reproducible-builds.org/debian/bullseye/index_suite_amd64_stats.html
  ~96% reproducible
  29647+828+334+23 = 30832 packages tested (excluding 64 non-amd64 and 
non-arch-all)

  
https://beta.tests.reproducible-builds.org/debian/results/bullseye_full.amd64+all.png
  ~92% reproducible
  26403+1398+62+820 = 28683 packages tested

I did notice an oddity: the tests.reproducible-builds.org tests cover
~1.8k more packages than the "beta" tests.  I'm not sure how those ~1.8k
packages would skew the results, but even then, the numbers still
probably look pretty good.

I know the results from the "beta" tests do separate arch:amd64 builds
and arch:all builds and then combines the data somehow, where the other
tests do arch:amd64+arch:all builds in a single pass per source package;
could that be related to the difference in total packages tested?

Holger also mentioned there were 500+ packages in bullseye without
.buildinfo files; I presume "beta" can only test packages with a
.buildinfo file, but does it log "missing .buildinfo" in some way?

Any other thoughts how the two different systems might be counting
packages differently?


live well,
  vagrant




Re: alpha.tests.reproducible-builds.org/debian

2021-09-17 Thread Vagrant Cascadian
On 2021-09-16, Frédéric Pierret wrote:
> Le 9/16/21 à 9:34 AM, Holger Levsen a écrit :
>> given that https://debian.notset.fr/rebuild/results/unstable.amd64.html is
>
> May I add also https://debian.notset.fr/rebuild/results/unstable.all.html for 
> the "all" arch.

Would it be plausible to get a composite view with both amd64+all ?


I'm also noticing a high number of binNMUs as not being reproducible,
even when they're reproducible on tests.r-b.o (e.g. bash, coreutils),
but almost no successfully reproduced binNMUs (only one i found at a
quick look was pcb).


>> really nice already and knowing that I won't have much time in the next two
>> weeks (and really wanting to show real results for Debian now...) it occurred
>> to me that we could point the dns entry for 
>> alpha.tests.reproducible-builds.org
>> to debian.notset.fr and make that server serve those page as
>> https://alpha.tests.reproducible-builds.org/debian and thus show some
>> results NOW.
>> 
>> What do you think? Frédéric (running notset.fr) liked the idea.
>
> Yes I confirm that :)

Also happy to see more exposure of this milestone of rebuilding packages
in the archive! :)

There were earlier attempts from NYU, if I recall, but maybe issues with
snapshot.debian.org interfered with making it reliable over time?


>> And maybe we should use preview.t.r-b.o instead of alpha.t.r-b.o?

Either sounds fine to me...

Or practice.tests.r-b.o ... which sort of has dual meaning in the sense
of "reproducible builds in practice vs. in theory" and also "we're
practicing this new view of reproducible builds".


live well,
  vagrant




Bug#993339: Package: reprotest Debian Salsa reprotest fails due to different compile times

2021-08-30 Thread Vagrant Cascadian
On 2021-08-30, Christopher Talbot wrote:
> Debian Salsa seems to fail reprotest due to different compile times.
...
> One recent example is here:
> https://salsa.debian.org/DebianOnMobile-team/vvmd/-/pipelines/283265
>
> When looking at the debs with diffoscope, I get example output like
> below (it happens with several files, I only included one for brevity):
>
> -drwxr-xr-x   0 root (0) root (0)0 2021-08-30
> 22:22:16.00 ./
> │ │ │ -drwxr-xr-x   0 root (0) root (0)0 2021-
> 08-30 22:22:15.00 ./usr/
> │ │ │ -drwxr-xr-x   0 root (0) root (0)0 2021-
> 08-30 22:22:16.00 ./usr/bin/
> │ │ │ --rwxr-xr-x   0 root (0) root (0)   119928 2021-
> 08-30 22:22:16.00 ./usr/bin/vvmd
...
> 08-30 22:22:24.00 ./usr/
> │ │ │ +drwxr-xr-x   0 root (0) root (0)0 2021-
> 08-30 22:22:25.00 ./usr/bin/
> │ │ │ +-rwxr-xr-x   0 root (0) root (0)   119928 2021-
> 08-30 22:22:25.00 ./usr/bin/vvmd

> In osk-sdl, it seemed to be fixed by running it a day later.
> https://salsa.debian.org/DebianOnMobile-team/osk-sdl/-/jobs/1821885

Hrm. That sounds suspicious...

I'm unable to reproduce this locally with reprotest against a vvmd or
osk-sdl git checkout. Maybe this has something to do with how the salsa CI is
setting up the directory... ? Maybe it unpacks the .orig.tar.* in some
way that alters the timestamps?

But that *should* be fixed by dpkg's SOURCE_DATE_EPOCH handling when it
generates the .deb ... hrm.

Maybe when the job was first run, SOURCE_DATE_EPOCH was set to a date
later than the file unpack times (e.g. maybe due to timezone?)... and
thus wouldn't clamp the timestamps ... that might explain re-running the
job later succeeding.
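The suspected failure mode can be sketched as a one-line model of clamping (a simplification, not dpkg's actual code):

```python
# Simplified model of SOURCE_DATE_EPOCH timestamp clamping (not dpkg's
# actual code): mtimes newer than the epoch are clamped down to it,
# older ones pass through. If SOURCE_DATE_EPOCH somehow ends up *later*
# than every file mtime, clamping is a no-op and any mtime variation
# between builds survives into the .deb.
def clamp(mtime, source_date_epoch):
    return min(mtime, source_date_epoch)

epoch = 1_630_000_000
print(clamp(epoch + 500, epoch) == epoch)        # newer file: clamped
print(clamp(epoch - 500, epoch) == epoch - 500)  # older file: untouched
```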


live well,
  vagrant




Re: Changing the first build to a UTF-8 locale?

2021-08-30 Thread Vagrant Cascadian
On 2021-08-30, Holger Levsen wrote:
> On Mon, Aug 30, 2021 at 12:16:40PM +0300, Adrian Bunk wrote:
>> there are currently several reproducible-only build failures due to
>> export LANG="C" in the first build.
>> 
>> Would it be OK to set LANG to C.UTF-8 or en_US.UTF-8 in the first build 
>> instead?
>> 
>> Building under non-UTF-8 locales feels pretty pointless at this time, 
>> and the build failures this causes in reproducible are annoying.
>> 
>> The second build is already using a UTF-8 locale.
>> 
>> The Debian buildds are already using C.UTF-8 for years.
>
> this seems sensible to me and I have implemented this change now, thanks 
> for suggesting it.
>
> (Should some valid rejection comes up for this idea we can always revert it.)

Not a rejection per se, but we could get some of the benefits of both
approaches by only using LANG=C in the unstable and experimental, just
like done for build paths, leaving testing and stable with LANG=C.UTF-8
(or en_US.UTF-8).

It does occasionally find genuine bugs where something builds
successfully with or without UTF-8, and having diffoscope output to
compare that might be helpful on occasion.
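The kind of genuine bug a non-UTF-8 first build can catch looks roughly like this (an invented illustration, not a case from this thread):

```python
# Invented illustration of the failure mode a plain "C" locale can
# trigger: non-ASCII output cannot be encoded with an ASCII locale
# encoding, so a build may fail only in the LANG=C pass.
text = "na\u00efve"  # "naïve"
try:
    text.encode("ascii")      # roughly what writing output under LANG=C does
    failed = False
except UnicodeEncodeError:
    failed = True
print(failed)                  # the LANG=C-only build failure
print(text.encode("utf-8"))    # succeeds under a UTF-8 locale
```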

C.UTF-8 also isn't in upstream glibc, so while supported in Debian (and
a *different* implementation in Fedora, as I understand it), maybe it
makes some sense to test (in some cases) a non-UTF-8 "C" locale.

Using en_US.UTF-8 to mean "default language" doesn't seem a great
alternative to me, even as a native speaker of ... well... en_US (not
*sure* if I "speak" UTF-8).

So, some more ideas on the subject, though not strongly opinionated
either. :)


live well,
  vagrant




Re: archive wide rebuild with "-DCMAKE_SKIP_RPATH=ON"?

2021-08-17 Thread Vagrant Cascadian
On 2021-08-17, Mattia Rizzolo wrote:
> On Thu, Aug 12, 2021 at 03:35:05PM +0200, Niels Thykier wrote:
>> > On 2021-08-10 Holger Levsen wrote:
>> >> [...]
>> > 
>> > Hello,
>> > 
>> > Isn't CMAKE_SKIP_RPATH a rather strange choice, what are the expected
>> > benefits over CMAKE_SKIP_INSTALL_RPATH? There is potential for breaking
>> > testsuites. 
>> > 
>> > Doesn't this break packages that intentionally use rpath for private
>> > shared libraries?
>> > 
>> > cu Andreas
>> 
>> Hi Andreas,
>> 
>> The CMAKE_SKIP_RPATH was recommended in #962474
>> (https://bugs.debian.org/962474) to improve reproducibility by default.
>> The bug contains the rationale for that option and explains the
>> underlying issue (plus links to the upstream bug tracker where that
>> topic was also discussed).
>
>
> Indeed, -DCMAKE_SKIP_RPATH=ON for me broke things in src:lib2geom:
> https://salsa.debian.org/multimedia-team/lib2geom/-/commit/f8a4c06edbd64d4c77d69ef3aea93e978a7156e4
>
> this is just an example, and I expect plenty of breakages.
>
>
> May I recommend we do *not* enable this on the r-b builders, as I'm
> positive it would break quite a few things here and there.  Rather, it
> would be perfect for aws' comparative rebuilds.  Now, who is the
> contact for those these days?

When we did the dpkg fixfilepath testing last year, Lucas Nussbaum was
the one who handled actually running the tests. I recall having to
submit a pull request against collab-qa-tools:

  https://salsa.debian.org/lucas/collab-qa-tools/-/merge_requests/9

Is that still a thing?


live well,
  vagrant




Re: merged /usr

2021-08-17 Thread Vagrant Cascadian
On 2021-08-17, Holger Levsen wrote:
> On Sun, Aug 15, 2021 at 12:16:39AM +0100, Simon McVittie wrote:
>> The failure mode we have sometimes seen is packages that were built in
>> a merged-/usr chroot not working on a non-merged-/usr system, although
>> that's detected by the reproducible-builds infrastructure and is already
>> considered to be a bug since buster (AIUI it's considered a non-RC bug
>> in buster and bullseye, as a result of things like #914897 mitigating it).
>
> FWIW i'm preparing a commit right now which will change the 
> reproducible-builds
> infrastructure in so far as:
>
> - bullseye will not be tested anymore for differences of building with or
>   without the usrmerge package installed (just like stretch and buster were
>   and are not).
> - bookworm and unstable will be tested for differences of building with or
>   without the usrmerge package installed.

Given:

 TC decision on "Merged /usr"
 https://bugs.debian.org/914897 

The short of it, as I read it, is that non-usrmerge systems will be
unsupported for bookworm, or did I misread that?


I would almost think it makes more sense to *not* test usrmerge for
bookworm, but continue to test it for bullseye and unstable (and
experimental) in the reproducible builds infrastructure.

This is my quick rationale for why I think that:

* bullseye has been doing usrmerge variations for its entire
  development cycle, it seems odd to change now.

* Keeping unstable/sid with usrmerge variations is good for QA, as it
  does occasionally catch deeper issues.

* Not doing usrmerge variations for bookworm is consistent with the plan
  for the next release (though we should have usrmerge always enabled
  then, as opposed to only building with non-usrmerge). It is also
  similar to build paths (which are not tested in the "testing" suite),
  a lower bar for the "testing" suite, as it is a relatively easy thing
  to workaround for reproducibility.


But, I've only caught a small part of this thread, so maybe there's more
to it. :)


live well,
  vagrant




Re: Help with Mroonga to finally get all of MariaDB Server reproducible

2021-07-23 Thread Vagrant Cascadian
On 2021-07-23, Chris Lamb wrote:
> Otto, thank you for your enthusiasm towards getting this package
> reproducible. It's clear from re-reading the issue on the GitHub issue
> for Mroonga how much you want to solve this.
...
>> Pretty sure this is the zeroed-out build path length getting embedded
>> (e.g. /build/path/1 vs. /build/path/1/2 have a different number of
>> characters).
>
> Interesting theory. So to test that, I built the version using
> differing build paths of the *same* length, and there was
> unfortunately still a diff between the ha_mroonga.so files. However,
> the 'amount' of differences were significantly reduced compared to the
> version currently on tests.reproducible-builds.org. Here is what I
> got:
>
>   https://people.debian.org/~lamby/EiquoKi5.html

Ah, perhaps it is not zeroed out after all... :/


> Focusing on the varying-length zeroed-out build path for a moment: I'd
> dearly love diffoscope to be able to display that clearly. The current
> output does not communicate that effectively, assuming that is the
> problem.

Maybe zeroed-out is a bit overly specific or even a red herring... or
there are more complexities to the issue...


> That that end, Vagrant do you have a more info for exactly what tool
> is doing this? Any GCC (??) documentation that you can reference? And
> perhaps a tool or utility that can show precisely this? I'd like to at
> least get a reliable testcase that we can incorporate into diffoscope.

I don't have a good suggestion for diffoscope, really.

It is unfortunately difficult as the compiler flag -ffile-prefix-map
strips out the human-readable bits. Sometimes I've disabled it just to
be able to see where the build paths are embedded, but that obviously
does not help diffoscope much.


>> Once your package with the timestamp patches applied migrates to
>> "testing" where we don't test build paths, it should be reproducible.
>
> (Just to clarify that the timestamp patches are in the experimental
> branch and not the version of the package in unstable. They therefore
> will not automatically migrate.)

I did think about the complexities of explaining this particular nuance
and decided to leave it as a bit of an exercise for the reader... :)


live well,
  vagrant




Re: Help with Mroonga to finally get all of MariaDB Server reproducible

2021-07-22 Thread Vagrant Cascadian
On 2021-07-21, Otto Kekäläinen wrote:
> We still have one issue in MariaDB to solve to have the whole package
> fully reproducible:
> https://tests.reproducible-builds.org/debian/rb-pkg/experimental/amd64/diffoscope-results/mariadb-10.5.html
>
> Chris Lamb looked into this in Feb 2020[1]  and we added one patch[2]
> but it wasn't enough. Upstream developer is very responsive (replying
> within a day[3]) but unfortunately nobody has managed to track down
> where the DYNSTR thing comes from.

Pretty sure this is the zeroed-out build path length getting embedded
(e.g. /build/path/1 vs. /build/path/1/2 have a different number of
characters).

Once your package with the timestamp patches applied migrates to
"testing" where we don't test build paths, it should be reproducible.


We unfortunately don't have a good solution or workaround for this yet,
other than building in a consistent build path.
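The suspected length leak can be illustrated with a toy model (entirely hypothetical — a stand-in for whatever embeds the path, not mroonga's actual build):

```python
# Hypothetical illustration of the suspected issue: even if a build
# step blanks the build path's characters, it may still reserve one
# byte per character, so paths of different lengths yield different
# output even though no path text survives.
def embed_zeroed_path(build_path):
    # stand-in for a tool that zeroes the path but keeps its length
    return b"HEADER" + b"\x00" * len(build_path) + b"PAYLOAD"

a = embed_zeroed_path("/build/path/1")
b = embed_zeroed_path("/build/path/1/2")
print(a == b)  # the two-byte length difference leaks through
print(embed_zeroed_path("/build/1") == embed_zeroed_path("/build/2"))
```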


> [3] https://github.com/mroonga/mroonga/issues/298

I added slightly more detailed commentary to the issue.


live well,
  vagrant




Re: Bug#914128: perl: usrmerge issues

2021-07-15 Thread Vagrant Cascadian
On 2021-07-15, Vagrant Cascadian wrote:
> On 2018-11-19, Niko Tyni wrote:
>> Diffoscoping a perl built on a usrmerged [1] system with
>> one built on a non-usrmerged system reveals the configure
>> process hardcoding some paths in the build results,
>>
>> [1] https://wiki.debian.org/UsrMerge
>>
>> Snippets from config.h, Config.pm, Config_heavy.pl, config.sh.debug.gz
>> and so forth include things below.
...
>> -libpth => '/usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
>> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
>> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib',
>> +libpth => '/usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
>> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
>> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib /lib64 /usr/lib64',
...
>> -libspath=' /usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
>> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
>> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib'
>> +libspath=' /usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
>> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
>> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib /lib64 /usr/lib64'

Attached patch also fixes these issues, by adjusting libpth and libspath
in debian/config.over ... it feels a little hackish ... but ...

With all three patches applied, perl builds reproducibly!


live well,
  vagrant
From c7d24b8965eecbdfcebbf21900c744ee7b5842a4 Mon Sep 17 00:00:00 2001
From: Vagrant Cascadian 
Date: Thu, 15 Jul 2021 23:28:41 +
Subject: [PATCH 3/3] debian/config.over: Adjust libpth and libspath to build
 consistently when built on usrmerge or non-usrmerge system.

---
 debian/config.over | 21 +
 1 file changed, 21 insertions(+)

diff --git a/debian/config.over b/debian/config.over
index f793f48c8..29de4814c 100644
--- a/debian/config.over
+++ b/debian/config.over
@@ -45,6 +45,27 @@ if ! echo $libpth | grep -q "$multiarch_dir"
 then
 libpth="$libpth $multiarch_dir"
 fi
+if ! echo $libspath | grep -q "$multiarch_dir"
+then
+libspath="$libspath $multiarch_dir"
+fi
+multiarch_lib_dir=/lib/`dpkg-architecture -qDEB_HOST_MULTIARCH`
+if ! echo $libpth | grep -q " $multiarch_lib_dir"
+then
+libpth="$libpth $multiarch_lib_dir"
+fi
+if ! echo $libspath | grep -q " $multiarch_lib_dir"
+then
+libspath="$libspath $multiarch_lib_dir"
+fi
+if ! echo $libpth | grep -q ' /lib$'
+then
+libpth="$libpth /lib"
+fi
+if ! echo $libspath | grep -q ' /lib$'
+then
+libspath="$libspath /lib"
+fi
 
 # set configuration time to latest debian/changelog entry
 cf_time=$(LC_ALL=C date --utc -d "$(cd .. && dpkg-parsechangelog | sed -n -e 's/^Date: //p')")
-- 
2.32.0





Re: Bug#914128: perl: usrmerge issues

2021-07-15 Thread Vagrant Cascadian
Control: tags 914128 patch

On 2018-11-19, Niko Tyni wrote:
> Diffoscoping a perl built on a usrmerged [1] system with
> one built on a non-usrmerged system reveals the configure
> process hardcoding some paths in the build results,
>
> [1] https://wiki.debian.org/UsrMerge
>
> Snippets from config.h, Config.pm, Config_heavy.pl, config.sh.debug.gz
> and so forth include things below.
>
> The /bin vs. /usr/bin command paths can probably be fixed/worked around
> by passing the full /bin paths (which should work on both systems)
> directly to Configure. The /lib64 thing in libpth / glibpth looks like
> a bug to me. I don't know what to do about libsdirs and libsfound.
>
> There's potential breakage if perl is built on a usrmerged system but
> run on a non-usrmerged one. I suspect the breakage would not be very bad
> and that most of this is cosmetic and not widely used.

Attached are two patches which *partially* address the issues, fixing
binary paths and glibpth.


> -libpth => '/usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib',
> +libpth => '/usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib /lib64 /usr/lib64',

Still an issue. I tried patching Configure (and the relevant
regen-configure files) to not de-duplicate directories, but without
success. How to patch Configure sure is "fun". :)


> -lns='/bin/ln -s'
> +lns='/usr/bin/ln -s'
>
> -rm_try='/bin/rm -f try try a.out .out try.[cho] try..o core core.try* 
> try.core*'
> +rm_try='/usr/bin/rm -f try try a.out .out try.[cho] try..o core core.try* 
> try.core*'

Fixed by attached patch, *mostly* using unspecified binary paths.
full_sed is intended to actually contain a full path, so I specified
/bin/sed explicitly (since it works on both usrmerge/non-usrmerge
systems).


> -glibpth='/usr/shlib  /lib /usr/lib /usr/lib/386 /lib/386 /usr/ccs/lib 
> /usr/ucblib /usr/local/lib '
> +glibpth='/usr/shlib  /lib /usr/lib /usr/lib/386 /lib/386 /usr/ccs/lib 
> /usr/ucblib /usr/local/lib /lib64 /usr/lib64 /usr/local/lib64 '

Fixed by attached patch. Debian mostly uses multiarch rather than
bi-arch, and the bi-arch directories were only added when /usr/lib64
exists anyways...


> -libsdirs=' /usr/lib/x86_64-linux-gnu'
> +libsdirs=' /lib/x86_64-linux-gnu'
...
> -libsfound=' /usr/lib/x86_64-linux-gnu/libgdbm.so 
> /usr/lib/x86_64-linux-gnu/libgdbm_compat.so 
> /usr/lib/x86_64-linux-gnu/libdb.so /usr/lib/x86_64-linux-gnu/libdl.so 
> /usr/lib/x86_64-linux-gnu/libm.so /usr/lib/x86_64-linux-gnu/libpthread.so 
> /usr/lib/x86_64-linux-gnu/libc.so /usr/lib/x86_64-linux-gnu/libcrypt.so'

No longer issues... ?


> -libspath=' /usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib'
> +libspath=' /usr/local/lib /usr/lib/gcc/x86_64-linux-gnu/8/include-fixed 
> /usr/include/x86_64-linux-gnu /usr/lib /lib/x86_64-linux-gnu /lib/../lib 
> /usr/lib/x86_64-linux-gnu /usr/lib/../lib /lib /lib64 /usr/lib64'

Still an issue. Probably inherited from libpth...


live well,
  vagrant
From 1a0d653ef6fdbaa136625e1251493a3d918e78f3 Mon Sep 17 00:00:00 2001
From: Vagrant Cascadian 
Date: Thu, 15 Jul 2021 20:20:07 +
Subject: [PATCH 1/2] Configure / libpth.U: Do not adjust glibpth when
 /usr/lib64 is present.

This results in differing values when built on a usrmerge system.
---
 Configure   | 1 -
 regen-configure/U/modified/libpth.U | 1 -
 2 files changed, 2 deletions(-)

diff --git a/Configure b/Configure
index 952d09990..ade58f915 100755
--- a/Configure
+++ b/Configure
@@ -1462,7 +1462,6 @@ glibpth="/lib /usr/lib $xlibpth"
 glibpth="$glibpth /usr/ccs/lib /usr/ucblib /usr/local/lib"
 test -f /usr/shlib/libc.so && glibpth="/usr/shlib $glibpth"
 test -f /shlib/libc.so && glibpth="/shlib $glibpth"
-test -d /usr/lib64 && glibpth="$glibpth /lib64 /usr/lib64 /usr/local/lib64"
 
 : Private path used by Configure to find libraries.  Its value
 : is prepended to libpth. This variable takes care of special
diff --git a/regen-configure/U/modified/libpth.U b/regen-configure/U/modified/libpth.U
index ba7126df4..d42928078 100644
--- a/regen-configure/U/modified/libpth.U
+++ b/regen-configure/U/modified/libpth.U
@@ -83,7 +83,6 @@
 ?X:	/usr/shlib is for OSF/1 systems.
 ?INIT:test -f /usr/shlib/libc.so && glibpth="/usr/shlib $glibpth"
 ?INIT:test -f /shlib/libc.so && glibpth="/shlib $g

Bug#988964: please demote diffoscope to Recommends

2021-07-07 Thread Vagrant Cascadian
Control: tags 988964 +patch

On 2021-05-25, Mattia Rizzolo wrote:
> On Fri, May 21, 2021 at 05:33:42PM -0700, Vagrant Cascadian wrote:
>> > Yes, just confirmed that it gets added through python3:Depends.
>> >
>> > So, I presume it will require mangling python3:Depends, or adjusting the
>> > code to convince the pybuild/dh_python/etc that it doesn't belong there.
>> 
>> diffoscope is in install_requires, removing it gets it out of
>> python3:Depends, but may have other unintended consequences:
>
> Yes, currently diffoscope is run unconditionally unless --no-diffoscope
> is passed.  which means that it need to learn to check before running it
> :)

How about this not really at all tested patch:

diff --git a/reprotest/__init__.py b/reprotest/__init__.py
index 6fd159a..f8ba450 100644
--- a/reprotest/__init__.py
+++ b/reprotest/__init__.py
@@ -824,8 +824,11 @@ def run(argv, dry_run=None):
 diffoscope = parsed_args.diffoscope
 if parsed_args.no_diffoscope:
 diffoscope_args = None
-else:
+elif shutil.which('diffoscope'):
 diffoscope_args = [diffoscope] + diffoscope_args
+else:
+logger.warning("diffoscope not available, falling back to regular diff")
+diffoscope_args = None
 control_build = parsed_args.control_build

 if not artifact_pattern:


Will try to test it soon...


live well,
  vagrant




Re: [Request for help] Making brian reproducible

2021-06-08 Thread Vagrant Cascadian
On 2021-06-08, Nilesh Patra wrote:
> On 6/8/21 3:40 AM, Vagrant Cascadian wrote:
>> On 2021-06-08, Nilesh Patra wrote:
>>> I was trying to make "brian" package reproducible. To my understanding it 
>>> has two problems:

>>> * Only _some_ files in the documentation it vendors has stuff (like tags, 
>>> examples, links) in random order across different builds.
...
> As far as I tweaked around, it looks like either an issue with how the docs 
> are generated via code, more specifically via 
> brian2/sphinxext/generate_examples.py script
> or it is a central problem with sphinx docs itself.
> Admittedly, I did not find anything unusual with the code anywhere, but for 
> sure, I _might've_ overlooked something important there.

I'll try to take a look at that specifically, then.


> Yeah, I've always run it with this option applied. The exact command I'm 
> using is:
>
> $ sudo reprotest --vary=-build_path,domain_host.use_sudo=1 --auto-build 
> ../brian_2.4.2-6.dsc -- schroot unstable-amd64-sbuild
>
> But this doesn't give any sensible hints (at least to me, it doesn't look very 
> useful) - it's almost the same as in the salsa CI reprotest logs.

Yeah, sounds like this is a totally nondeterministic issue, then.


> If you have some free cycles, would you mind taking in a look?

Will try, though my guess is the problem is somewhere in sphinx...


> And if we find out that this is due to some problem with sphinx
> itself, do you think it is worthwhile to file a bug report with the
> SOURCE_DATE_EPOCH thingy fixed? That'd be a partial patch though.

I prefer submitting individual targeted patches even if they don't fix
all reproducibility issues; it results in a smaller reproducibility
diff, which is less noise (and possibly diffoscope runtime) for someone
to fix the remaining issues!

I also prefer one bug/patch per issue, it is clear what is fixed, and if
some other reproducibility issue shows itself, it can be fixed
separately without ending up with reproducible.patch,
reproducible2.patch ... reproducibleN.patch.


live well,
  vagrant




Re: [Request for help] Making brian reproducible

2021-06-07 Thread Vagrant Cascadian
On 2021-06-08, Nilesh Patra wrote:
> I was trying to make "brian" package reproducible. To my understanding it has 
> two problems:
>
> * use datetime.date.today() and similar stuff for build documentation - I 
> suppose I fixed these with using SOURCE_DATE_EPOCH

Your fixes look reasonable; just be sure the way you're passing the time
is independent of timezone (it might be fine as is).
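A timezone-independent way to consume SOURCE_DATE_EPOCH (a sketch under the assumption the docs only need a date string; not the actual brian patch):

```python
import os
from datetime import datetime, timezone

# SOURCE_DATE_EPOCH is defined as seconds since 1970-01-01 UTC, so
# always convert with an explicit UTC timezone, never the local one;
# a naive fromtimestamp() would vary with the builder's timezone.
def build_date():
    epoch = int(os.environ.get("SOURCE_DATE_EPOCH", "0"))
    return datetime.fromtimestamp(epoch, tz=timezone.utc).strftime("%Y-%m-%d")

os.environ["SOURCE_DATE_EPOCH"] = "1623110400"  # 2021-06-08 00:00:00 UTC
print(build_date())  # 2021-06-08 regardless of the builder's timezone
```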


> * Only _some_ files in the documentation it vendors has stuff (like tags, 
> examples, links) in random order across different builds.
> By the looks of it, it seems this randomness is due to the way data is being 
> inserted into files with the brian2/sphinxext/generate_examples.py script,
> but I am having trouble figuring it out beyond this point.

Wild hunch, can you try forcing the locale in debian/rules...

  export LANG=C.UTF-8 LC_ALL=C.UTF-8


There are a long list of issues related to sphinx, none of which look
exactly like what you have based on the descriptions, but might be close
enough to be worth tagging your package with 
randomness_in_documentation_generated_by_sphinx:

  
https://salsa.debian.org/reproducible-builds/reproducible-notes/-/blob/b0211037be80220ff0941475846840e8dc796fbf/issues.yml#L470

Or maybe one of the others.


I think sphinx is one of the documentation generation toolchains that if
we fixed some reproducibility issues in, it would fix quite a few of the
remaining unreproducible packages!


> I'd really appreciate any help here.

Try running reprotest with the --auto-build flag, which tries a build
varying only one thing at a time. It performs two normalized builds, and
if they are reproducible, it can usually identify what variations
trigger the problem... on a good day. :)


live well,
  vagrant




Bug#988964: please demote diffoscope to Recommends

2021-05-21 Thread Vagrant Cascadian
On 2021-05-21, John Scott wrote:
> On my system, reprotest has the following Depends/Recommends:
> Depends: diffoscope (>= 112~), python3-distro, python3-rstr, python3:any, 
> python3-debian, apt-utils, libdpkg-perl, procps, python3-pkg-resources
> Recommends: disorderfs, faketime, locales-all, sudo
>
> Reprotest should really recommend Diffoscope so that users who only
> want to check if packages are reproducible don't need to install it;
> this is what the --no-diffoscope argument is for.
>
> I would send a Merge Request, but I frankly can't figure out where this
> comes from. The applicable section in debian/control says
> Depends: ${python3:Depends},
>  python3-debian,
>  apt-utils,
>  libdpkg-perl,
>  procps,
>  python3-pkg-resources,
>  python3-rstr,
>  ${misc:Depends}
> Recommends:
>  diffoscope (>= 112~),
>  disorderfs,
>  faketime,
>  locales-all,
>  sudo,
>
> so my only guess is that Diffoscope gets pulled into
> ${python3:Depends}?

Yes, just confirmed that it gets added through python3:Depends.

So, I presume it will require mangling python3:Depends, or adjusting the
code to convince the pybuild/dh_python/etc that it doesn't belong there.

live well,
  vagrant




prime-phylo build path issues in debug symbols

2021-05-20 Thread Vagrant Cascadian
On 2021-05-20, Nilesh Patra wrote:
> On triggering the salsa pipeline for prime-phylo on salsa, the
> reprotest job fails[1]
> There is a difference in the debug symbols .debs
>
> On taking a look at both debdiff and diffoscope, it seems (to me) that it is a
> buildpath issue, and I checked if -ffile-prefix-map is missing, and it
> wasn't really missing.
> It's hard for me to parse the diffocope logs, and I'm unsure where to
> start debugging this and hence need help there.
>
> The diffoscope output can be found here: https://paste.gnome.org/p9vm7cpyu
> Any help is welcome. Please consider to take a look.

It's already been flagged with build_id_differences_only:

  
https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/prime-phylo.html

It appears to build reproducibly in our bullseye tests, where we don't
vary the build path, so most likely you are correct; something
build path related.


This most likely needs to be fixed at the toolchain level and there may
not be much you can do in this package.

live well,
  vagrant




Re: [rt.debian.org #8547] Re: Debian RT: snapshot mirror

2021-04-14 Thread Vagrant Cascadian
On 2021-03-31, Holger Levsen wrote:
> On Sat, Mar 06, 2021 at 11:31:51AM +0100, Frédéric Pierret wrote:
> I'm not sure if you are aware of https://github.com/fepitre/snapshot-mirror
> which is Frédéric's project to create a partial snapshot.d.o mirror so that
> we can continue our work on reproducing Debian bullseye arch:amd64 and 
> arch:all
> (these are the partial aspects of said mirror: only useful for reproducing
> bullseye amd64/all).
>
> and while I'm not sure how much Frédéric still needs those raw postgresql
> tables (on which he'll surely respond)... I'd like to add another request:
>
> would it be possible to whitelist his IP 92.188.110.153 so it doesn't get
> throttled when accessing snapshot.d.o?
>
> Currently his work is available at http(s)://debian.notset.fr/snapshot
> and goes back until November 2020. We plan to go back until the oldest
> package built for bullseye... and then, we plan to add another mirror
> of that data which will have better resources than Frédéric's mirror @home.

I've heard it now has data as far back as June 2020, but the main
limiting factor is the throttling that snapshot.debian.org does. It
looks like we may need to mirror as far back as 2017 to actually get
proper coverage to be able to rebuild all of bullseye... so, at this
rate, it may take a year to get a partial mirror working.

Getting a (partial) mirror of snapshot.debian.org live would reduce the
overall burden on snapshot.debian.org, which is presumably why there is
throttling at all, so it seems like an all-around win.

I'm guessing it's just a time limitation, and an acknowledgement at
least would be much appreciated, but if there are other blockers we
would also love to figure out a way to help!


Looking forward to rebuilding Debian all over again!


live well,
  vagrant




Re: Quick look at Reproducible Builds progress in Debian

2021-04-09 Thread Vagrant Cascadian
On 2021-03-06, Vagrant Cascadian wrote:
> On 2021-02-08, Vagrant Cascadian wrote:
>> On 2021-01-06, Vagrant Cascadian wrote:
>>> On 2020-12-07, Vagrant Cascadian wrote:
>>>> The percentage of packages that are reproducible admittedly doesn't look
>>>> amazing, even though there has been steady progress:
>>>>
>>>>   stretch  93.8%
>>>>   buster   94.1%
>>>>   bullseye 94.9%
>>>>   unstable 83.7%
>>>
>>>  bullseye 95.2% (+0.3%)
>>>  unstable 84.2% (+0.5%)
>>
>> bullseye 95.5% (+0.3%)
>> unstable 84.7% (+0.5%)
>
> bullseye 95.5% (+0.0%)
> unstable 84.9% (+0.2%)

bullseye 95.4% (-0.1%) 
unstable 84.8% (-0.1%)

>>>> Yet a look at the overall numbers of packages that are reproducible by
>>>> release:
>>>>
>>>>   stretch  23204
>>>>   buster   26740
>>>>   bullseye 28560
>>>>   unstable 26456
>>>
>>>   bullseye 28929 (+369)
>>>   unstable 26684 (+228)
>>
>> bullseye 29412 (+483)
>> unstable 27063 (+379)
>
> bullseye 29630 (+218)
> unstable 27162 (+99)

bullseye 29560 (-70)
unstable 27153 (-9)


Lost a little ground this month, but nothing astounding either way.


>>>> For reference, I'm just looking at and visually comparing:
>>>>
>>>>   https://tests.reproducible-builds.org/debian/bullseye/index_suite_amd64_stats.html
>>>>   https://tests.reproducible-builds.org/debian/unstable/index_suite_amd64_stats.html


live well,
  vagrant




Re: Quick look at Reproducible Builds progress in Debian

2021-03-06 Thread Vagrant Cascadian
On 2021-02-08, Vagrant Cascadian wrote:
> On 2021-01-06, Vagrant Cascadian wrote:
>> On 2020-12-07, Vagrant Cascadian wrote:
>>> The percentage of packages that are reproducible admittedly doesn't look
>>> amazing, even though there has been steady progress:
>>>
>>>   stretch  93.8%
>>>   buster   94.1%
>>>   bullseye 94.9%
>>>   unstable 83.7%
>>
>>  bullseye 95.2% (+0.3%)
>>  unstable 84.2% (+0.5%)
>
> bullseye 95.5% (+0.3%)
> unstable 84.7% (+0.5%)

bullseye 95.5% (+0.0%)
unstable 84.9% (+0.2%)

>>> Yet a look at the overall numbers of packages that are reproducible by
>>> release:
>>>
>>>   stretch  23204
>>>   buster   26740
>>>   bullseye 28560
>>>   unstable 26456
>>
>>   bullseye 28929 (+369)
>>   unstable 26684 (+228)
>
> bullseye 29412 (+483)
> unstable 27063 (+379)

bullseye 29630 (+218)
unstable 27162 (+99)

Pretty modest improvement this month, but still progress.

I'm a little confused about how the numbers/percentages work out
(e.g. +99 packages in unstable shouldn't bump +0.2%) but ... this is
just a rough measure after all.
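For what it's worth, the arithmetic can work out once the total number of tested packages is accounted for, since it also moves between months. A quick back-of-the-envelope sketch (the totals are estimates back-computed from the figures quoted above, not measured values):

```python
prev_repro, prev_pct = 27063, 84.7   # last month's quoted figures
new_repro = 27162                    # this month: +99 packages

# Back-compute the approximate total from the quoted percentage:
prev_total = prev_repro / (prev_pct / 100)   # roughly 31950 packages

# With an unchanged total, +99 packages would actually be +0.3%:
print(round(100 * new_repro / prev_total, 1))          # -> 85.0
# If the total itself grew by a few dozen packages, the rounded
# figure lands on the reported 84.9% instead:
print(round(100 * new_repro / (prev_total + 42), 1))   # -> 84.9
```

So a shifting denominator is enough to explain the apparent mismatch.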


>>> For reference, I'm just looking at and visually comparing:
>>>
>>>   https://tests.reproducible-builds.org/debian/bullseye/index_suite_amd64_stats.html
>>>   https://tests.reproducible-builds.org/debian/unstable/index_suite_amd64_stats.html


live well,
  vagrant




Re: partial snapshot mirror amd64/bullseye/bookworm

2021-02-28 Thread Vagrant Cascadian
On 2021-02-27, Holger Levsen wrote:
> snapshot.debian.org is an awesome service for the wider free software 
> community
> and especially for those working on reproducible builds. Sadly accessing 
> *many*
> packages from it is limited and troublesome (see below for bug numbers), thus
> we (mostly Frédéric Pierret and myself) came up with the idea of setting up a
> partial mirror, covering only the years 2017 until now and arch:amd64 and 
> arch:all
> only as well. (for a start, maybe we need 2015+2016 too and maybe we can 
> afford
> to also host arm64 or some other architecture...)
...
> #977653 Please document rate limits on snapshots.debian.org
> #960304 snapshot.debian.org: Snapshot repo repeatedly cutting off connection, 
> returning partial content
> #969906 snapshot.debian.org: error 500 internal server error after some 
> requests via Python

I'd like to point out that these issues are basically the last known
major blockers to performing verification builds against packages
actually in the Debian archive (as opposed to the builds done on
tests.reproducible-builds.org, which performs two builds against the
current state of the archive).

Implementing a workaround for the limitations of snapshot.debian.org
could move Debian beyond Reproducible Builds in *theory* and into
Reproducible Builds in *practice*!

Happy to help with this, hope to see it move forward!


To further limit the scope, one might also be able to take all the
currently relevant .buildinfo files for bullseye and build a list of the
package versions needed from that (rather than assuming 2017 as a good
baseline). Might be a little trickier to implement, of course...
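That list could be derived mechanically from the Installed-Build-Depends field of each .buildinfo file. A rough sketch of the idea in Python, using a made-up, heavily trimmed .buildinfo fragment (real files list hundreds of packages):

```python
import re

# Hypothetical, heavily trimmed .buildinfo fragment; real files carry
# hundreds of entries in Installed-Build-Depends.
buildinfo = """\
Format: 1.0
Source: hello
Installed-Build-Depends:
 base-files (= 11),
 gcc-10 (= 10.2.1-6),
 make (= 4.3-4)
"""

def needed_versions(text):
    # Grab the indented continuation lines of Installed-Build-Depends,
    # then pull out each "package (= version)" pair.
    block = re.search(r"Installed-Build-Depends:\n((?: [^\n]*\n?)+)", text).group(1)
    return dict(re.findall(r"([a-z0-9][a-z0-9.+-]*) \(= ([^)]+)\)", block))

print(needed_versions(buildinfo))
```

Running this over every bullseye .buildinfo and taking the union would yield the exact package/version set the mirror needs to carry.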


live well,
  vagrant




Re: Quick look at Reproducible Builds progress in Debian

2021-02-08 Thread Vagrant Cascadian
On 2021-01-06, Vagrant Cascadian wrote:
> On 2020-12-07, Vagrant Cascadian wrote:
>> I just wanted to spend a few moments looking over the progress
>> Reproducible Builds has made in Debian over the last few release cycles.
>
> We've had some good progress since last month!

And another ~month of solid progress again!

>> The percentage of packages that are reproducible admittedly doesn't look
>> amazing, even though there has been steady progress:
>>
>>   stretch  93.8%
>>   buster   94.1%
>>   bullseye 94.9%
>>   unstable 83.7%
>
>  bullseye 95.2% (+0.3%)
>  unstable 84.2% (+0.5%)

bullseye 95.5% (+0.3%)
unstable 84.7% (+0.5%)

>> Yet a look at the overall numbers of packages that are reproducible by
>> release:
>>
>>   stretch  23204
>>   buster   26740
>>   bullseye 28560
>>   unstable 26456
>
>   bullseye 28929 (+369)
>   unstable 26684 (+228)

bullseye 29412 (+483)
unstable 27063 (+379)

>> For reference, I'm just looking at and visually comparing:
>>
>>   https://tests.reproducible-builds.org/debian/stretch/index_suite_amd64_stats.html
>>   https://tests.reproducible-builds.org/debian/buster/index_suite_amd64_stats.html
>>   https://tests.reproducible-builds.org/debian/bullseye/index_suite_amd64_stats.html
>>   https://tests.reproducible-builds.org/debian/unstable/index_suite_amd64_stats.html


live well,
  vagrant



Re: Updating dpkg-buildflags to enable reproducible=+fixfilepath by default

2021-01-09 Thread Vagrant Cascadian
On 2021-01-09, Lisandro Damián Nicanor Pérez Meyer wrote:
> Note: in case we do not agree on this topic this will be the text I'll
> send to the
> tech-ctte.

Thanks for taking the time to draft some text. If we can come closer to
agreement on the proposed text, that would probably take a bit of the
load off of the tech-ctte. Hopefully some of my comments move in that
direction, although I also have added some counterpoint text as well...


> Let me start by noting that I have nothing against reproducibility. In fact
> quite the opposite: I love the idea... as long as it's properly implemented.

It is even "recommended" in Debian Policy to build reproducibly
with different build paths.


> The problem here is that __FILE__ is a public, well defined API with very 
> valid
> use cases (more on this below), even if the current implementation of all the
> major compilers go against reproducibility. 

At least two major compilers, GCC and Clang, provide the
-ffile-prefix-map feature to do exactly that (which is what dpkg is
enabling), so it seems a bit overstated to say that all major
compilers "go against" reproducibility. They simply do not enable
reproducibility by default.


> So if we want to mangle __FILE__
> (thus breaking API) this should be done by an opt-in, and **never** by opting
> out. Else we risk breaking a valid implementation, be it already in the archive
> or any new package added afterwards.

While I understand that may be how you feel, it would be appreciated if
we could use something a little less loaded than "mangle". Perhaps:

  So if we want to enable features using __FILE__ in a way which
  arguably breaks API


> Even more: we library maintainers continuously ask our upstreams to keep their
> API as stable as possible, and if not possible following some rules
> (SONAME change, for example). I will present an option to do the same 
> ourselves
> without breaking API/ABI.
>
> # __FILE__ is a public, well defined API
>
> During the course of #876901 many reasons were used to both say that __FILE__
> is or is not a well-defined API. In fact one of the pieces of evidence used was the
> compiler's documentation. For example
> https://gcc.gnu.org/onlinedocs/cpp/Standard-Predefined-Macros.html
> (emphasis mine):
>
>   This macro expands to the name of the current input file, in the form of a C
>   string constant. **This is the path by which the preprocessor opened
> the file**,
>   not the short name specified in ‘#include’ or as the input file name 
> argument.
>   For example, "/usr/local/include/myheader.h" is a possible expansion of this
>   macro.
>
> This definition says that it's up to the preprocessor to define the path. So,
> what has been the behaviour of all major compilers during all these years? 
> Using
> the full path. The proof of this is Qt itself. Check
> https://sources.debian.org/src/qtbase-opensource-src/5.15.2+dfsg-2/src/testlib/qtestcase.h/#L216
>
> This code has been working on **every** platform Qt works without any change.
> Qt is compiled in a myriad of OSs, using a myriad of compilers. They all do
> the exact same thing. So developers depend on a very stable definition
> of an API.
>
> And we all know that breaking API is bad, except if done carefully (read 
> below).
>
> # Doing the right thing
>
> This is just an idea of "doing the right thing", like bumping SONAME
> on a library.
> It definitely doesn't have to be the only one.

I understand your position is fundamentally about "Doing the right
thing" but I would say that we are all trying to do the right thing.


> I understand that the reproducibility people do not want to consider
> providing the
> same build path. While this is arguable I do understand that
> reproducibility without
> depending on the build path is a good thing. So I've tried to come up with a
> path to achieve this.

> ## New macro and warning (if they do not exist already)
>
> This would be the first step. The idea is to provide a new macro that,
> by definition
> and documentation, can be mangled with the help of the build
> system, much as you
> are currently doing with __FILE__ now.
>
> The second step in this is to make compilers create a reproducibility warning 
> if
> some code uses __FILE__. This will have three effects:

An alternate macro hasn't been proposed, and getting one adopted would
surely take years at best, possibly decades. That doesn't seem too
realistic.


> - discouraging it's use
> - creating awareness on reproducibility.

We've discouraged the use of __FILE__ for years, have done plenty of
outreach on the subject of reproducibility, and gotten traction with
various upstream projects.


> - making other distros jump into reproducibility in a much easier way.

Arguably some distros (e.g. Arch Linux) are passing us by when it comes
to real-world reproducibility; I'm not sure Debian is the example by
which all other distros should be measured anymore. Which is good in
some ways, but somewhat disappointing to see Debian start something

Re: Updating dpkg-buildflags to enable reproducible=+fixfilepath by default

2021-01-09 Thread Vagrant Cascadian
On 2021-01-09, Lisandro Damián Nicanor Pérez Meyer wrote:
> Oh, I have sadly forgotten to mention another thing.
>
> On Sat, 9 Jan 2021 at 15:53, Lisandro Damián Nicanor Pérez Meyer
>  wrote:
>> # __FILE__ is a public, well defined API
>
> According to:
> Adrian Bunk mentions it in
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=876901#10
> Holger Levsen in https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=876901#42
> Mattia Rizzolo on https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=876901#192
>
> The ability of gcc to change __FILE__ was a patch that, at the time of
> those writings, wasn't yet accepted.

That is no longer the case. The fixfilepath feature enabled in dpkg only
uses features (e.g. -ffile-prefix-map) available in both upstream GCC
(>=9? or 8? ~2018) and Clang (>= 10), possibly other compilers as
well. There are no special patches to toolchains needed to enable this
feature.


> Even more, Ximin Luo wrote on
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=876901#212
>
>   The GCC patch (neither the previous nor the planned version) does
>   not change the default behaviour of __FILE__, and was never intended
>   to. Instead, it gives users the ability to rewrite __FILE__, more
>   specifically a prefix of it.
>
> makes it clear that the default behaviour is not changed. So if the
> patch got accepted (did it get accepted?) it was only to allow
> reproducible people to break API in order to get reproducibility.

Since then, an alternate implementation was merged upstream (in order
to support this feature in another distribution, if I recall
correctly), and the reproducible=+fixfilepath feature in dpkg takes
advantage of it.

It didn't address all the cases of the old GCC patches that Ximin
submitted, but the Reproducible Builds team decided in 2018 to make use
of upstream supported features only and dropped all of our patches to
GCC.

The notable difference is that it no longer makes use of an environment
variable; it requires passing the -ffile-prefix-map option to the
compiler. The dpkg patch simply adds this to the relevant default
*FLAGS variables.
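As a rough illustration of what the option does, here is a toy Python model of the prefix rewriting; the paths and package name are hypothetical, and the real rewriting of course happens inside the compiler:

```python
# Toy model of -ffile-prefix-map=OLD=NEW: any recorded path (such as a
# __FILE__ expansion) that starts with OLD has that prefix rewritten
# to NEW at compile time; other paths pass through untouched.
def file_prefix_map(path, old, new):
    return new + path[len(old):] if path.startswith(old) else path

# Hypothetical build directories; dpkg's fixfilepath maps the build
# directory to ".", so two different build paths yield the same string:
print(file_prefix_map("/build/hello-1.0/src/main.c", "/build/hello-1.0", "."))
print(file_prefix_map("/tmp/x/hello-1.0/src/main.c", "/tmp/x/hello-1.0", "."))
```

Both calls print "./src/main.c", which is exactly why the resulting binaries no longer differ by build path.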

(For historical completeness, though somewhat an aside to the topic at
hand, the -ffile-prefix-map approach does not address all the cases of
the former patches to GCC as the compiler flags sometimes end up in
various shipped artifacts in *some* cases, though certainly not all.)


> This in itself, if something has not changed in the meantime, marks
> another reference that __FILE__ is a public, well defined API.

I think reading #876901 demonstrates that the case can be made either
way; it is not as well defined as you make it out to be.


live well,
  vagrant



Re: Updating dpkg-buildflags to enable reproducible=+fixfilepath by default

2021-01-09 Thread Vagrant Cascadian
On 2021-01-08, Lisandro Damián Nicanor Pérez Meyer wrote:
> On Fri, 8 Jan 2021 at 21:15, Lisandro Damián Nicanor Pérez Meyer
>  wrote:
> In fact most of those packages would not be unreproducible if the
> environment would be the same as the original build. That includes the
> build path.

True, that is a fairly simple workaround. Which is why we do not vary
the build path when testing bullseye for tests.reproducible-builds.org.

But we do vary build paths when testing experimental and unstable on
tests.reproducible-builds.org, as it helps identify cases where build
paths are an issue.

It would also help greatly as we move towards verification builds of
packages in the archive to not have to worry about build paths as much.


> I do understand that it is *desirable* to be able to reproducibly
> build a package no matter the build path, but that's just desirable.

According to Debian Policy it is recommended:

  "It is recommended that packages produce bit-for-bit identical
   binaries even if most environment variables and build paths are
   varied.  It is intended for this stricter standard to replace the
   above when it is easier for packages to meet it."


> The real fix here is to do the right thing, ie, provide the very same
> environment as the original build, including the build path.

That does sound like a workaround more than a real fix.


> So again, mangling __FILE__ should not be a default.

I'll agree to disagree.


I will admit that a change of defaults in dpkg this close to freeze does
seem a bit on the late side of things. Still, the process has been going
on for months, announced in accordance with the process for getting
defaults changes into dpkg. Bugs with trivial patches were filed in
October.


Unfortunately, the affected packages seem to be disproportionately
maintained by the KDE team. I did what I could to mitigate that impact
by actually building each and every one of the affected packages to
ensure that the opt-out patches worked correctly. Most of those have
been applied already.


live well,
  vagrant



Re: Updating dpkg-buildflags to enable reproducible=+fixfilepath by default

2021-01-08 Thread Vagrant Cascadian
On 2021-01-08, Lisandro Damián Nicanor Pérez Meyer wrote:
> On Fri, 13 Nov 2020 at 17:40, Vagrant Cascadian
>  wrote:
>>
>> On 2020-11-13, Sune Vuorela wrote:
>> > On 2020-10-27, Vagrant Cascadian  wrote:
>> >> Though, of course, identifying the exact reproducibility problem would
>> >> be preferable. One of the common issues is test suites relying on the
>> >> behavior of __FILE__ returning a full path to find fixtures or other
>> >> test data.
>> >
>> > has QFINDTESTDATA been adapted to work with this, or are we just
>> > "lucky" that most packages don't actually build and run test suites?
>>
>> Yes, QFINDTESTDATA is one of the primary (only?) issues with test suites
>> found in about 20-30 packages in the archive, according to the
>> archive-wide rebuild that was performed. For most of those packages
>> patches have been submitted, and some are already either fixed or marked
>> as pending.
>
> But QFINDTESTDATA is using __FILE__ in a valid way. It might not be
> what you are expecting, but still a valid usage.
>
> See https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=876901 We have
> discussed this before.
>
>
>> If it could be fixed at the core for QFINDTESTDATA, that would be nicer
>> than fixing 20-30 packages individually, though we're not there right
>> now.
>
> In that case I would expect a valid patch from the people interested
> in enabling this. In the meantime the dpkg change broke a very valid
> usage. Inconvenient for reproducibility? yes, probably, but still very
> valid.

We did a full archive rebuild testing this change, and I provided
patches to all known affected packages several months ago. It is a
one-line change in debian/rules in most cases:

  https://udd.debian.org/cgi-bin/bts-usertags.cgi?user=reproducible-builds%40lists.alioth.debian.org=fixfilepath

It seems there are fewer than 10 packages left that have not applied the
patch.
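For reference, the one-liner in those patches is typically along these lines in debian/rules (dpkg's feature-area syntax, opting this one package out of fixfilepath while leaving the rest of the reproducible feature area at its defaults):

```make
# debian/rules: opt out of the fixfilepath feature for this package
# (the test suite relies on __FILE__ expanding to a full path).
export DEB_BUILD_MAINT_OPTIONS = reproducible=-fixfilepath
```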

Longer-term, it would be nice to be able to fix QFINDTESTDATA to be
compatible, sure.


live well,
  vagrant



Re: Quick look at Reproducible Builds progress in Debian

2021-01-06 Thread Vagrant Cascadian
On 2020-12-07, Vagrant Cascadian wrote:
> I just wanted to spend a few moments looking over the progress
> Reproducible Builds has made in Debian over the last few release cycles.

We've had some good progress since last month!


> The percentage of packages that are reproducible admittedly doesn't look
> amazing, even though there has been steady progress:
>
>   stretch  93.8%
>   buster   94.1%
>   bullseye 94.9%
>   unstable 83.7%

 bullseye 95.2% (+0.3%)
 unstable 84.2% (+0.5%)


> Yet a look at the overall numbers of packages that are reproducible by
> release:
>
>   stretch  23204
>   buster   26740
>   bullseye 28560
>   unstable 26456

  bullseye 28929 (+369)
  unstable 26684 (+228)


> For reference, I'm just looking at and visually comparing:
>
>   https://tests.reproducible-builds.org/debian/stretch/index_suite_amd64_stats.html
>   https://tests.reproducible-builds.org/debian/buster/index_suite_amd64_stats.html
>   https://tests.reproducible-builds.org/debian/bullseye/index_suite_amd64_stats.html
>   https://tests.reproducible-builds.org/debian/unstable/index_suite_amd64_stats.html


live well,
  vagrant



Re: symbol sizes changing

2020-12-28 Thread Vagrant Cascadian
On 2020-12-28, Nick Black wrote:
> Hey there Reproducible Builds team!
>
> I'd like to make my package "notcurses" reproducible. First off,
> I love the interface at
> https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/notcurses.html,
> the clearest output I've seen yet from diffoscope(?).
>
> All my differences seem to be due to small changes in dynamic
> symbol sizes. See for example data.tar.xz's 'readelf --wide
> --dynamic {}' output:
>
> 18  ·0x000a·(STRSZ)··3446·(bytes)
> 18  ·0x000a·(STRSZ)··3450·(bytes)
>
> What might be causing these symbol sizes to change?

Since it has historically built reproducibly in our bullseye tests,
my first guess would be that it is build path related; build
paths are not varied in our infrastructure on bullseye.

In unstable and experimental we do vary build paths, and also pass
DEB_BUILD_OPTIONS=reproducible=+fixfilepath, which passes the
-ffile-prefix-map=BUILDPATH=. argument to strip the build path. This
sometimes leaves traces: the two build paths have different lengths
(e.g. /1/2/3 vs. /1/2/3/4), so a different amount of padded/empty
space can remain.
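The STRSZ readings quoted above (3446 vs. 3450 bytes) fit this pattern: the dynamic string table is a sequence of NUL-terminated strings, so a single path-derived entry of a different length shifts STRSZ by exactly the length difference. A toy Python model (the symbol names and paths below are made up):

```python
# Toy model of a dynamic string table: its size (STRSZ) is the sum of
# NUL-terminated entries, so one path-derived string of a different
# length shifts STRSZ by exactly the length difference.
def strtab_size(strings):
    return sum(len(s) + 1 for s in strings)  # +1 for the trailing NUL

common = ["notcurses_init", "notcurses_stop", "libnotcurses.so.2"]
build_a = strtab_size(common + ["/build/1st/src/demo.c"])    # 21-char path
build_b = strtab_size(common + ["/build/2/x/y/src/demo.c"])  # 23-char path
print(build_b - build_a)  # -> 2
```

A 4-byte STRSZ delta, as in the readelf output above, would correspond to a 4-character difference in some retained path.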


Does notcurses use __FILE__ macros anywhere (either directly in the
source code, or indirectly via some other build system)? Something along
those lines would be needed to completely fix the issue.


> I also have some NT_GNU_BUILD_ID changes, but I think I can
> handle those myself. Thanks!

The build id differences are most likely a result of the other
differences, such as those above.


Thanks for asking about reproducible builds in notcurses!


live well,
  vagrant



Re: Getting notifications when package reproducibility regresses

2020-12-27 Thread Vagrant Cascadian
On 2020-12-27, John Scott wrote:
> I recall having seen something about this somewhere (wiki?) but haven't found 
> it. If it's still possible for maintainers to sign up for notifications when 
> reproducibility of their packages regress, I'm very much interested.
>
> If you go off package names instead of by maintainer, I maintain
> open-ath9k-htc-firmware.

It's currently a manual process; added you now.

It should eventually show up on:

  https://tests.reproducible-builds.org/debian/index_notify.html

Thanks for taking an interest in keeping your package building
reproducibly!


live well,
  vagrant



Re: maintaining reproducible-notes over the long term

2020-12-18 Thread Vagrant Cascadian
On 2020-12-18, Mattia Rizzolo wrote:
> On Fri, Dec 18, 2020 at 12:53:53PM +, Holger Levsen wrote:
>> >  yet also see value in keeping history a little easier to
>> > reach than grepping through git logs.
>> 
>> right.
>> 
>> you also said on irc:
>> 
>> "please wait at least until packages are verified as reproducible in 
>> bullseye on all architectures before removing them from
>> reproducible-notes ... there are some packages which have intermittent
>> reproducibility issues"
>> 
>> which AIUI can be solved by adjusting the scripts Mattia is running
>> in cron.
>
> The thing is, the script removes archived bugs and issues that are not
> reproduced anymore and marked as "deterministic" in issues.yml.
> I read over the diff after that's done and manually remove notes for
> packages that I confirm are indeed fixed (mostly, because the changelog
> says it fixed those issues indeed) and which comments just describe what
> the problem used to be.
>
> Having the script blindly remove the "comment" section would be
> dangerous, as that could easily accidentally remove stuff that was meant
> to be preserved.

Ok. It was unclear to me which parts were automated and which parts were
manual.


> Vagrant: did you perhaps notice that I (manually or automatically?)
> removed notes for stuff that indeed shouldn't have been removed?

Maybe these all fall into the categories you describe (although dynare
isn't reproducible on arm64, due to non-deterministic issues), but
since they haven't propagated to bullseye yet, I would think it would
be good to leave them open until the bugs are archived, or at least
verified as fixed in bullseye?

4ecce9d22a41868b4d0ba2c81e0d25fe81838a2c
https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/bowtie2.html

e6adafa12eb7a9f0a9e4d79a4db7b468a5725d50
https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/dynare.html

a50358f1fe94ea5d2d62b57a716a3a120ef6aacd
https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/cctools.html


Or maybe I just had different expectations, without the background of
when some of these decisions were made. :)


Thanks for all the maintenance!


live well,
  vagrant



maintaining reproducible-notes over the long term

2020-12-11 Thread Vagrant Cascadian
I've often quietly wondered about maintenance of our
"reproducible-notes" issues, and was recently reminded by:

  Remove -ffile-prefix-map tags from packages that do now build
  https://salsa.debian.org/reproducible-builds/reproducible-notes/-/commit/25e7569e394fb4cf882c30690d26d8da1d8decc4

I had been using this issue to track progress, and still see some value
in noting that a package was affected by an issue, even though it
builds reproducibly now...

But I also run into many cases where a package historically had an
issue and no longer does; sometimes the issue entry starts to get
cluttered, or makes me wonder if some packages were misflagged.

Yet I often look at the fixed packages for an issue to find potential
solutions for other packages that are still failing. Similarly for
removing bug reports once an issue is archived; sometimes links to these
bug reports contain fixes that are useful beyond that one specific
package fix...

But there is also a signal-to-noise ratio issue in our notes tracking
with so many old fixed issues, so I can also see value in doing this
sort of housecleaning work...

So not really sure what to think; I see value cleaning and maintaining
current status yet also see value in keeping history a little easier to
reach than grepping through git logs.

Curious what others think about maintaining the reproducible-notes
repository over the long-term...


live well,
  vagrant



Quick look at Reproducible Builds progress in Debian

2020-12-07 Thread Vagrant Cascadian
I just wanted to spend a few moments looking over the progress
Reproducible Builds has made in Debian over the last few release cycles.

The percentage of packages that are reproducible admittedly doesn't look
amazing, even though there has been steady progress:

  stretch  93.8%
  buster   94.1%
  bullseye 94.9%
  unstable 83.7%

So, we've seen ~1% increase in reproducible builds over two release
cycles since stretch, and despite being steady progress overall, it
might feel a bit disappointing... but we also have a somewhat higher bar
for bullseye (e.g. varied merged /usr) and much higher standard for
unstable (e.g. varied build path).


Yet a look at the overall numbers of packages that are reproducible by
release:

  stretch  23204
  buster   26740
  bullseye 28560
  unstable 26456

Over the last two releases, that is over 5000 additional packages that
are reproducible. There are more *reproducible* packages in bullseye
than the *total* number of packages in buster! Unstable has nearly the
same number of reproducible packages as buster, despite a wider variety
of variations, and more than 3000 more than stretch.


And while we track issues by how many source packages are reproducible,
I am fairly confident that many packages have had partial fixes applied
that result in more reproducible binary packages, even if not all of the
packages of a given source are reproducible. We unfortunately do not
track that to demonstrate numbers, but I know it has happened for at
least a few patches I've submitted.


So, while Reproducible Builds in Debian is dealing with the challenges
of a "last mile" problem (please forgive my use of an anachronistic
measuring system), we're also keeping pace with thousands of newly
introduced packages yet still gradually and steadily pulling ever so
slightly further ahead!


For reference, I'm just looking at and visually comparing:

  https://tests.reproducible-builds.org/debian/stretch/index_suite_amd64_stats.html
  https://tests.reproducible-builds.org/debian/buster/index_suite_amd64_stats.html
  https://tests.reproducible-builds.org/debian/bullseye/index_suite_amd64_stats.html
  https://tests.reproducible-builds.org/debian/unstable/index_suite_amd64_stats.html


Keep building reproducibly!


live well,
  vagrant



Office Hours / Ask Me Anything 2021-01-07 18:00-20:00 UTC

2020-12-07 Thread Vagrant Cascadian
We will set aside some time to be available for asking questions about
anything related to Reproducible Builds.

This is an opportunity to ask introductory questions and is intended to
be welcoming to newcomers, though of course, any questions relating to
Reproducible Builds should be fair game!


We had fun at the first session:

  
http://meetbot.debian.net/reproducible-builds/2020/reproducible-builds.2020-11-30-17.19.log.html

So we are going to try this again!


Our next session is planned for January 7th, 18:00 UTC going until 20:00
UTC:

  https://time.is/compare/1800_07_Jan_2021_in_UTC


The location will be irc.oftc.net in the #reproducible-builds
channel. If you are new to IRC, there is a web interface available:

  https://webchat.oftc.net/?channels=reproducible-builds


Remember to wait for a few minutes after asking a question, to give
people a chance to respond. Once you have joined the channel, even if
there is a conversation already going, jump right in, no need to ask
permission to speak!


Please share this with anyone or with any networks where you think there
might be people interested in Reproducible Builds.


Thanks!


live well,
  vagrant



Re: Bug#900837: update on mass-rebuild of packages for reproducible builds

2020-12-03 Thread Vagrant Cascadian
On 2019-03-05, Holger Levsen wrote:
> I ran Chris's script again on coccia, with the result that currently
> 6084 source packages in the archive need a rebuild for reproducible
> builds, as they were built with an old dpkg version not producing
> .buildinfo files.

I ran it just now, and we're down to 3433! Still a ways to go, but
getting there...

Updated list attached.

live well,
  vagrant
abego-treelayout
abicheck
accelio
adacontrol
aiksaurus
aiowsgi
airlift-slice
album
alqalam
amispammer
amoeba-data
amphetamine-data
amqp-specs
android-platform-external-rappor
angband-audio
apertium-oci
apertium-szl
api-sanity-checker
apt-config-auto-update
apt-src
apt-venv
archmbox
arduino
argparse4j
args4j
aribas
arrayfire
asciimathtml
asis
aspell-am
aspell-cs
aspell-cy
aspell-el
aspell-fr
aspell-hr
aspell-hsb
aspell-hu
aspell-hy
aspell-is
aspell-kk
aspell-pl
aspell-ro
aspell-sl
aspell-sv
aspell-uz
asql
asterisk-moh-opsound
asterisk-prompt-de
asterisk-prompt-es-co
asterisk-prompt-fr-armelle
asterisk-prompt-fr-proformatique
asterisk-prompt-it
astrodendro
astrometry-data-2mass
at-at-clojure
atmel-firmware
autoconf2.64
autodns-dhcp
autofill-forms
automaton
awit-dbackup
axmlrpc
bar-cursor-el
beancounter
beckon-clojure
bidiui
binstats
biomaj3-zipkin
blag-fortune
blosxom
bmt
boilerpipe
brag
brebis
broctl
bsaf
bsdowl
bsfilter
bytecode
c3
caffeine-cache
calculix-ccx-doc
calculix-ccx-test
carmetal
casacore-data-lines
casacore-data-sources
cava
cclive
cdlabelgen
cecil
cecil-flowanalysis
cereal
cfi
chartkick.js
checkbot
checksecurity
cheshire-clojure
chewmail
chronicle
cl-abnf
cl-asdf-finalizers
cl-asdf-flv
cl-asdf-system-connections
cl-closure-common
cl-command-line-arguments
cl-containers
cl-cxml
cl-daemon
cl-dynamic-classes
cl-garbage-pools
cl-github-v3
cl-launch
cl-log
cl-lparallel
cl-markdown
cl-metatilities-base
cl-mustache
cl-quri
cl-rfc2388
cl-salza2
cl-sqlite
cl-trivial-utf-8
cl-umlisp
cl-umlisp-orf
cl-utilities
cl-uuid
cl-yason
cl-zip
clamassassin
clamav-unofficial-sigs
class.js
classycle
clc-intercal
clj-http-clojure
clout-clojure
coco-cs
code2html
combat
commons-exec
commons-pool
compass-toolkit-plugin
compojure-clojure
concurrent-dfsg
configure-debian
conmux
cons
console-cyrillic
courier-filter-perl
couriergraph
cpath-clojure
crafty-bitmaps
crafty-books-medium
crafty-books-medtosmall
crafty-books-small
cream
crossfire-maps
crossfire-maps-small
crypto-equality-clojure
crypto-policies
crypto-random-clojure
cryptojs
crystalcursors
csmash-demosong
css3pie
ctapi
cthumb
ctop
curvesapi
cvs-buildpackage
cvsdelta
cvsutils
cvsweb
d3-tip.js
dailystrips
daptup
db4o
dbix-easy-perl
dbus-sharp
dbus-sharp-glib
dd-plist
ddtc
debaux
debdate
debdry
debget
debian-dad
debpear
debroster
delimmatch
devpi-common
dgen
dh-rebar
dhcpy6d
dia-shapes
dict-jargon
dictem
dictzip-java
dirgra
discover-data
dish
dizzy
django-hvad
django-measurement
dkimproxy
dl10n
dnlib
dns-browse
dns323-firmware-tools
doc-debian
doc-linux-fr
docbook
docbook-dsssl
docbook-ebnf
docbook-html-forms
docbook-mathml
docbook-slides
docbook-slides-demo
docbook-utils
docbook-website
docbook-xsl-doc
dokujclient
doom-wad-shareware
drbdlinks
drgeo-doc
drraw
dtd-parser
dtdparse
durep
dvcs-autosync
dvd-slideshow
dvi2ps-fontdata
dvi2ps-fontdesc-morisawa5
dynalang
dynamips
ebook-dev-alp
ecaccess
eclipselink-jpa-2.1-spec
edb
edgar
edict-el
ehcache
eigenbase-farrago
eigenbase-resgen
el-get
elektra
elida
elscreen
ember-media
emdebian-archive-keyring
epic4-help
es-module-loader-0.17.js
esnacc
etoys
evenement
explorercanvas
extra-xdg-menus
faenza-icon-theme
famfamfam-flag
famfamfam-silk
fasd
fast-zip-clojure
fb-music-high
fccexam
feed2imap
felix-framework
felix-gogo-command
felix-gogo-runtime
felix-gogo-shell
fest-assert
fest-test
fest-util
festival-ca
festvox-don
festvox-kallpc8k
festvox-kdlpc16k
festvox-kdlpc8k
festvox-rablpc8k
fetchyahoo
fheroes2-pkg
file-mmagic
filepp
fillets-ng-data
finance-yahooquote
fizmo
flake8-docstrings
flamethrower
flare
flask-mail
flatlatex
flotr
fluid-soundfont
flup
focalinux
foiltex
fonts-alegreya-sans
fonts-arkpandora
fonts-averia-gwf
fonts-averia-sans-gwf
fonts-averia-serif-gwf
fonts-babelstone-modern
fonts-crosextra-carlito
fonts-ddc-uchen
fonts-dzongkha
fonts-hosny-thabit
fonts-inconsolata
fonts-johnsmith-induni
fonts-junction
fonts-kaushanscript
fonts-khmeros
fonts-klaudia-berenika
fonts-kristi
fonts-larabie
fonts-leckerli-one
fonts-lg-aboriginal
fonts-lklug-sinhala
fonts-lobstertwo
fonts-nafees
fonts-ocr-b
fonts-oflb-asana-math
fonts-open-sans
fonts-quattrocento
fonts-radisnoir
fonts-sambhota-tsugring
fonts-sambhota-yigchung
fonts-sil-tagmukay
fonts-stix
fonts-tibetan-machine
fonts-tuffy
fonts-ubuntu-title
fonts-ukij-uyghur
fonty-rg
fort77
fortunes-bg
fortunes-br
fortunes-cs
fortunes-eo
fortunes-fr
fortunes-ga
fortunes-ru
freetable
fretsonfire-songs-muldjord
fretsonfire-songs-sectoid
fsharp
funnelweb-doc
funny-manpages
fuzzysort
galib
games-thumbnails
gap-transgrp
gav-themes
gearman-server
gedit-latex-plugin
geki3
genesisplusgx

"Office Hours / Ask Me Anything" 2020-11-30 17:00-20:00 UTC

2020-11-25 Thread Vagrant Cascadian
Hi!

We are experimenting with setting aside some time to be available for
asking questions about anything related to Reproducible Builds.

This is an opportunity to ask introductory questions and is intended to
be welcoming to newcomers, though of course, any questions relating to
Reproducible Builds should be fair game!

Our first session is planned for November 30th, 17:00 UTC going until
20:00 UTC:

  https://time.is/compare/1700_30_Nov_2020_in_UTC

The location will be irc.oftc.net in the #reproducible-builds
channel. If you are new to IRC, there is a web interface available:

  https://oftc.net/WebChat/

Please share this with anyone or with any networks where you think there
might be people interested in Reproducible Builds.


Thanks!


live well,
  vagrant


