Hi all,

Since the hadoop-3.4.0-RC0 vote, I have received valuable feedback. I
encountered some issues during the preparation of hadoop-3.4.0-RC0, and I
will address these issues in hadoop-3.4.0-RC1.

The vote on RC0 is now closed. Once RC1 is released, I will invite the
community to review and vote once again.

Thank you all once again for your support!

Best Regards,
Shilun Fan.

On Wed, Jan 17, 2024 at 9:03 AM slfan1989 <slfan1...@apache.org> wrote:

> Thank you very much for the response!
>
> The content is very comprehensive and valuable.
>
> I will prepare hadoop-3.4.0-RC1 according to the instructions provided by
> you, and after RC1 is packaged, I will use validate-hadoop-client-artifacts
> for validation.
>
> Best Regards,
> Shilun Fan.
>
> On Tue, Jan 16, 2024 at 12:34 AM Steve Loughran
> <ste...@cloudera.com.invalid> wrote:
>
>> -1 I'm afraid, just due to staging/packaging issues.
>>
>> This took me a few goes to get right myself, so nothing unusual.
>>
>> Note that I used my validator project, which is set up to retrieve binaries,
>> check signatures, run maven builds against staged artifacts, *clean up any
>> local copies first*, and more.
>>
>> This uses apache ant to manage all this:
>>
>> https://github.com/steveloughran/validate-hadoop-client-artifacts
>>
>> Here's the initial build.properties file I used to try to manage this:
>>
>> ###### build.properties:
>> hadoop.version=3.4.0
>> rc=RC0
>> amd.src.dir=https://home.apache.org/~slfan1989/hadoop-3.4.0-RC0-amd64/
>> http.source=https://home.apache.org/~slfan1989/hadoop-3.4.0-RC0-amd64/
>>
>> release=hadoop-${hadoop.version}-RC0
>> rc.dirname=${release}
>> release.native.binaries=false
>> git.commit.id=cdb8af4f22ec
>> nexus.staging.url=
>> https://repository.apache.org/content/repositories/orgapachehadoop-1391/
>> hadoop.source.dir=${local.dev.dir}/hadoop-trunk
>> ######
>>
>> When I did my own builds, all the artifacts were created without the RC0
>> suffix. It is critical that the names match, because the .sha512 checksums
>> include that suffix in their paths:
>>
>> > cat hadoop-3.4.0-RC0.tar.gz.sha512
>> SHA512 (hadoop-3.4.0-RC0.tar.gz) =
>>
>> e50e68aecb36867c610db8309ccd3aae812184da21354b50d2a461b29c73f21d097fb27372c73c150e1c035003bb99a61c64db26c090fe0fb9e7ed6041722eab
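
As a side note, a tagged checksum in that exact format can be produced and checked with coreutils; a minimal sketch, using a dummy stand-in for the real tarball:

```shell
#!/bin/sh
# Sketch: create and verify a BSD/"tagged" SHA-512 checksum in the same
# format as the release's .sha512 file. The tarball is a dummy stand-in.
set -e
printf 'release bits' > hadoop-3.4.0-RC0.tar.gz
sha512sum --tag hadoop-3.4.0-RC0.tar.gz > hadoop-3.4.0-RC0.tar.gz.sha512
cat hadoop-3.4.0-RC0.tar.gz.sha512   # SHA512 (hadoop-3.4.0-RC0.tar.gz) = ...
sha512sum -c hadoop-3.4.0-RC0.tar.gz.sha512   # fails if the file is renamed
```

Because the filename is embedded in the checksum line, `sha512sum -c` breaks as soon as the tarball name diverges from what the .sha512 file records.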
>>
>>
>> Maven artifacts: staging problems
>>
>> I couldn't build with the -Pstaging profile, as the staging repository
>> wasn't yet closed - I tried to do that myself.
>>
>> This failed with a checksum rule violation:
>>
>> Event: Failed: Checksum Validation
>> Monday, January 15, 2024 14:37:13 GMT (GMT+0000)
>> typeId checksum-staging
>> failureMessage INVALID SHA-1:
>>
>> '/org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.4.0/hadoop-mapreduce-client-jobclient-3.4.0-tests.jar.sha1'
>> failureMessage Requires one-of SHA-1:
>>
>> /org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.4.0/hadoop-mapreduce-client-jobclient-3.4.0-tests.jar.sha1,
>> SHA-256:
>>
>> /org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.4.0/hadoop-mapreduce-client-jobclient-3.4.0-tests.jar.sha256,
>> SHA-512:
>>
>> /org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.4.0/hadoop-mapreduce-client-jobclient-3.4.0-tests.jar.sha512
>>
>> I don't know precisely what this means... my guess is that the upload
>> didn't include everything.
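
If the root cause is simply missing checksum files in the upload, they can be regenerated next to each artifact before staging; a sketch under that assumption, with a dummy jar standing in for the artifact named in the error:

```shell
#!/bin/sh
# Sketch: generate the plain-hex .sha1/.sha256/.sha512 files that Maven
# repositories expect alongside an artifact. Dummy jar for illustration.
set -e
jar=hadoop-mapreduce-client-jobclient-3.4.0-tests.jar
printf 'jar bytes' > "$jar"
for alg in 1 256 512; do
  "sha${alg}sum" "$jar" | awk '{print $1}' > "$jar.sha$alg"
done
ls "$jar".sha*
```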
>>
>> Note my client-validator module can check this; just run its maven test
>> command:
>>
>> mvn clean test -U -P3.4 -Pstaging
>>
>> GPG signing: all good.
>>
>> Picked your key up from the site (ant gpg.keys)... the first validation
>> with ant gpg.verify was unhappy, as your key wasn't trusted. I've signed it
>> and pushed that signature up, so people who trust me get some reassurance
>> about you.
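
For anyone verifying by hand rather than via the ant targets, the check boils down to importing the signer's key and running `gpg --verify` against the detached .asc; a self-contained sketch that substitutes a throwaway key for the real release-manager key:

```shell
#!/bin/sh
# Sketch: detached-signature verification as done for the release tarballs.
# A throwaway RSA key stands in for the release manager's published key;
# a real check would import the key from the project's KEYS file instead.
set -e
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"
gpg --batch --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: RM Test
Name-Email: rm@example.invalid
Expire-Date: 0
%commit
EOF
printf 'release bits' > hadoop-3.4.0.tar.gz
gpg --batch --detach-sign --armor hadoop-3.4.0.tar.gz   # writes the .asc
gpg --verify hadoop-3.4.0.tar.gz.asc hadoop-3.4.0.tar.gz
```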
>>
>> My build then failed as the gpg code couldn't find the
>> hadoop-3.4.0-aarch64.tar.gz.asc
>>
>> The problem here is that although we want separate arm and x86 tar files,
>> we don't really want separately built jars, as that only creates different
>> jars in the wild.
>>
>> The way I addressed that was: after creating the x86 release on an EC2 VM
>> and downloading it, I did a local arm64 build and created an arm .tar.gz
>> file, then copied it into the same dir as the amd64 binaries, with the
>> arm64 .tar.gz filename and the .asc and .sha512 checksum files all renamed
>> (and the checksum file patched to match the new name).
>>
>>
>> https://github.com/steveloughran/validate-hadoop-client-artifacts?tab=readme-ov-file#arm64-binaries
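
That rename-and-patch step can be scripted; a sketch of just the mechanics, with dummy content standing in for the real builds. The .asc is renamed the same way, which is safe because the signature covers the file contents, not the filename:

```shell
#!/bin/sh
# Sketch: rename a locally built tarball to its arm64 release name and
# patch its checksum file to match. Dummy content for illustration.
set -e
old=hadoop-3.4.0.tar.gz
new=hadoop-3.4.0-aarch64.tar.gz
printf 'arm64 build' > "$old"
sha512sum --tag "$old" > "$old.sha512"      # "SHA512 (file) = ..." format
mv "$old" "$new"
sed "s/$old/$new/g" "$old.sha512" > "$new.sha512"   # patch the embedded name
rm "$old.sha512"
sha512sum -c "$new.sha512"                  # verifies under the new name
```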
>>
>
