Richard Stallman wrote:

>   > What would be helpful is if `make dist' would guarantee to produce the same
>   > tarball (bit-to-bit) each time it is run, assuming the tooling is the same
>   > version.  Currently I believe that is not the case (at least due to
>   > timestamps)
>
> Isn't this a description of "reproducible compilation"?

No, but it is closely related. Compilation produces binary executables, while `make dist` produces a freestanding /source/ archive.
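
For the tarball side, the usual fix is to pin every piece of metadata that varies from run to run.  As a rough sketch of the technique (this follows the common reproducible-builds.org tar/gzip recipe, not anything `make dist` does today; `foo-1.0` is a stand-in for the distribution directory):

    # Pick a fixed reference time, e.g. the latest commit's timestamp
    # (SOURCE_DATE_EPOCH is the conventional name for this variable).
    SOURCE_DATE_EPOCH=$(git log -1 --format=%ct)

    # Normalize the things that normally differ between runs:
    # member order, mtimes, ownership, and the gzip header timestamp.
    tar --sort=name \
        --mtime="@${SOURCE_DATE_EPOCH}" \
        --owner=0 --group=0 --numeric-owner \
        -cf - foo-1.0 | gzip -n > foo-1.0.tar.gz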

> We want to make that standard, but progress is inevitably slow
> because many packages need to be changed.

I am not sure that is actually a good idea. (Well, it is mostly a good idea, except for one issue.) If compilation is strictly deterministic, then everyone ends up with identical binaries, which means an exploit that cracks one will crack them all. Varied binaries make life harder for crackers developing exploits, and may even make a "one exploit to crack them all" attack impossible. This is one of the reasons that exploits have long hit Windows (where all the systems are identical) so much harder than the various GNU/Linux distributions (where the binaries are likely to differ even before distribution-specific patches are considered).

Ultimately, this probably means that we should have both the /ability/ to compile deterministically and either a compiler mode or a post-processing pass (a linker option?) that intentionally shuffles the final executable.
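
Both halves arguably exist in today's toolchains, at least in rough form.  The following is only a sketch under stated assumptions (the flags are GCC/Clang and LLVM lld options, `hello.c` is a stand-in, and the --shuffle-sections syntax has changed between lld releases), not a description of how any GNU package builds now:

    # Deterministic side: pin the usual sources of build-to-build variation.
    export SOURCE_DATE_EPOCH=1700000000     # fixes __DATE__/__TIME__ expansion
    gcc -O2 -c hello.c -o hello.o \
        -ffile-prefix-map="$PWD"=. \
        -frandom-seed=hello.c

    # Diversified side: give each function its own section, then have
    # LLVM's lld shuffle the input sections with a per-build seed.
    # (Assumes a GCC new enough to accept -fuse-ld=lld; in newer lld the
    # option takes a <section-glob>=<seed> pair, and a seed of 0 means
    # "pick a random seed".)
    gcc -O2 -ffunction-sections -c hello.c -o hello.o
    gcc -fuse-ld=lld hello.o -o hello -Wl,--shuffle-sections='.text*=0'

One way to reconcile the two goals would be to record the shuffle seed alongside the build, so a diversified binary remains reproducible for anyone who knows the seed.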


-- Jacob

