I spent a couple of hours this weekend getting Claude to answer a question
for me: "do the x86 and aarch64 JAR files differ?"
The 3.4.3 release ended up using the ARM build as the source of its JAR
files, because releasing from the cloud x86 VM I was using produced
multiple staging repositories in Apache Nexus. This was suggested to be
VPN/IP-address related: Nexus saw requests coming in from different source
IP addresses and so assigned the artifacts to different repositories.
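The core of answering "do the JARs differ?" is just comparing the two
archives entry by entry. Here is a minimal sketch of that idea (my own
illustration, not code from the auditor project): hash every entry in each
JAR and report entries that are missing or whose bytes changed.

```python
import hashlib
import zipfile


def jar_digests(path):
    """Map each file entry in a JAR/zip to the SHA-256 of its bytes."""
    digests = {}
    with zipfile.ZipFile(path) as zf:
        for name in zf.namelist():
            if name.endswith("/"):
                continue  # skip directory entries
            digests[name] = hashlib.sha256(zf.read(name)).hexdigest()
    return digests


def diff_jars(path_a, path_b):
    """Return (only_in_a, only_in_b, changed) entry-name lists."""
    a, b = jar_digests(path_a), jar_digests(path_b)
    only_a = sorted(set(a) - set(b))
    only_b = sorted(set(b) - set(a))
    changed = sorted(n for n in set(a) & set(b) if a[n] != b[n])
    return only_a, only_b, changed
```

Note this flags *any* byte difference, so timestamps or compiler ordering
inside individual class files will show up too; deciding which differences
matter still needs a human (or an LLM) looking at the flagged entries.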

I was curious whether the binaries were different, and also whether it
would be possible to detect and defend against a malicious release
manager. That is: if I had added a back door into the code, would it have
been detected?

Here, then, is my auditor:
https://github.com/steveloughran/auditor

And here are the results:
https://gist.github.com/steveloughran/d3c9ad6a718bfec68085b08584ae414e

The main issue to flag is that in hadoop-common the protobuf classes are
somehow different. leveldbjni is different too, which is interesting but
not too concerning, though the auditor does flag that the differing code
reads the system environment, which makes it extra suspicious.
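A cheap way to spot "code that looks at the system environment" without a
full bytecode parser is a byte-level scan: a compiled class that calls
System.getenv() carries the method name "getenv" as a UTF-8 string in its
constant pool. This is my own heuristic sketch, not the auditor's actual
check, and it will produce false positives (e.g. a string literal
containing "getenv"), so it only shortlists suspects for manual review.

```python
import zipfile


def env_readers(jar_path):
    """List .class entries in a JAR whose bytes contain 'getenv'.

    Heuristic only: any invocation of System.getenv() puts the UTF-8
    string 'getenv' in the class's constant pool, but so would an
    unrelated string literal, so treat hits as candidates to inspect.
    """
    suspects = []
    with zipfile.ZipFile(jar_path) as zf:
        for name in zf.namelist():
            if name.endswith(".class") and b"getenv" in zf.read(name):
                suspects.append(name)
    return sorted(suspects)
```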

I doubt that's new; it's just something we've never noticed before.
Whatever the native protoc compilers are doing, they seem to generate
different classes, even with the same shaded protobuf being used.
