This is an automated email from the ASF dual-hosted git repository.

stevel pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/hadoop-release-support.git
The following commit(s) were added to refs/heads/main by this push:
     new f512f99  HADOOP-19018. releasing

f512f99 is described below

commit f512f994a9269edc1248e2e85ada21d97c2ce135
Author: Steve Loughran <ste...@cloudera.com>
AuthorDate: Wed Feb 21 19:51:40 2024 +0000

    HADOOP-19018. releasing
---
 README.md                                  | 96 ++++++++++++++++++++++--------
 build.xml                                  | 87 ++++++++++++---------------
 pom.xml                                    | 94 ++++++++++++-----------------
 src/releases/release-info-3.5.0.properties | 34 +++++++++++
 4 files changed, 181 insertions(+), 130 deletions(-)

diff --git a/README.md b/README.md
index 81ed6b1..a38992e 100644
--- a/README.md
+++ b/README.md
@@ -23,8 +23,39 @@
 There is a maven `pom.xml` file. This is used to validate the dependencies
 from staging repositories as well as run some basic tests to validate
 the classpath.
 
+# Prerequisites
 
-## Files
+Installed applications/platforms:
+
+Java 8+. Later releases are also valid for validation.
+
+Apache Ant.
+
+To use the scp/ssh commands, Ant needs the jsch library on its classpath.
+Verify this with:
+```
+ant -diagnostics
+```
+
+BAD
+```
+-------------------------------------------
+ Tasks availability
+-------------------------------------------
+sshsession : Missing dependency com.jcraft.jsch.Logger
+scp : Missing dependency com.jcraft.jsch.Logger
+sshexec : Missing dependency com.jcraft.jsch.Logger
+```
+
+Here are the apt-get commands to set up a Raspberry Pi for the arm validation:
+```bash
+apt-get install openjdk-17-jdk
+apt-get install ant libjsch-java
+apt-get install gpgv
+apt-get install maven
+apt-get install subversion
+```
+
+# Files
 
 ### `/build.xml`
 
@@ -147,7 +178,7 @@
 This is critical when validating downstream project builds.
 
 ```
 ant purge-from-maven
 ```
 
-### Download RC to `target/incoming`
+### SCP RC down to `target/incoming`
 
 This will take a while! Look in `target/incoming` for progress.
 
@@ -165,7 +196,7 @@
 ant copy-scp-artifacts release.dir.check
 
 The `release.dir.check` target just lists the directory.
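The `ant purge-from-maven` step mentioned earlier matters because a stale local cache can mask a broken staging repository. A rough manual equivalent, under the assumption that the target clears cached Hadoop artifacts from the default local repository (the real target may purge more groups), is:

```bash
# Approximate, hypothetical equivalent of `ant purge-from-maven`:
# delete locally cached Hadoop artifacts so the next build must
# resolve org.apache.hadoop artifacts from the staging repository.
repo="${HOME}/.m2/repository"
rm -rf "${repo}/org/apache/hadoop"
```

After this, a downstream build that still succeeds is genuinely resolving against the staged artifacts.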
-### Arm64 binaries
+### Building Arm64 binaries
 
 If arm64 binaries are being created then they must be built on an arm
 docker image.
@@ -176,7 +207,8 @@
 The arm process is one of
 1. Create the full set of artifacts on an arm machine (macbook, cloud vm, ...)
 2. Use the ant build to copy and rename the `.tar.gz` with the native binaries only
 3. Create a new `.asc` file.
-4. Generate new sha512 checksum file containing the new name.
+4. Generate a new `.sha512` checksum file containing the new name.
+   Renaming the old file is not sufficient.
 5. Move these files into the `downloads/release/$RC` dir
 
 To perform these stages, you need a clean directory of the same
@@ -188,7 +220,7 @@
 In `build.properties` declare its location
 
 arm.hadoop.dir=/Users/stevel/hadoop/release/hadoop
 
-In that dir, create the relese.
+In that dir, create the release.
 
 ```bash
 time dev-support/bin/create-release --docker --dockercache --mvnargs="-Dhttp.keepAlive=false -Dmaven.wagon.http.pool=false" --deploy --native --sign
 ```
 
@@ -209,12 +241,6 @@
 ant arm.release release.dir.check
 ```
 
-### verify gpg signing
-
-```bash
-ant gpg.keys gpg.verify
-```
-
 ### copy to a staging location in the hadoop SVN repository.
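This staging step can be sketched as a plain `svn` workflow. The checkout path and RC directory name below are hypothetical examples; the actual ant targets in `build.xml` encapsulate the real paths:

```bash
# Hypothetical sketch: stage the RC directory in the ASF dist "dev" SVN
# area. The RC name (hadoop-3.4.0-RC3) is an illustration only.
svn checkout https://dist.apache.org/repos/dist/dev/hadoop dist-dev
cp -r downloads/release/hadoop-3.4.0-RC3 dist-dev/
cd dist-dev
svn add hadoop-3.4.0-RC3
svn commit -m "Hadoop 3.4.0 RC3"
```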
 When committed to subversion it will be uploaded and accessible via a
@@ -304,16 +330,22 @@
 http.source=https://dist.apache.org/repos/dist/dev/hadoop/hadoop-${hadoop.version}-$rc/
 
 ### Targets of Relevance
 
-| target               | action                     |
-|----------------------|----------------------------|
-| `release.fetch.http` | fetch artifacts            |
-| `release.dir.check`  | verify release dir exists  |
-| `release.src.untar`  | untar retrieved artifacts  |
-| `release.src.build`  | build the source           |
-| `release.src.test`   | build and test the source  |
-| `gpg.keys`           | import the hadoop KEYS     |
-| `gpg.verify`         | verify the D/L'd artifacts |
-|                      |                            |
+| target                  | action                                                     |
+|-------------------------|------------------------------------------------------------|
+| `release.fetch.http`    | fetch artifacts                                            |
+| `gpg.keys`              | import the hadoop KEYS                                     |
+| `gpg.verify`            | verify the signature of the retrieved artifacts            |
+| `release.dir.check`     | verify release dir exists                                  |
+| `release.src.untar`     | untar retrieved artifacts                                  |
+| `release.src.build`     | build the source; call `release.src.untar` first           |
+| `release.src.test`      | build and test the source; call `release.src.untar` first  |
+| `release.bin.untar`     | untar the binary file                                      |
+| `release.bin.commands`  | execute a series of commands against the untarred binaries |
+| `release.site.untar`    | untar the downloaded site artifact                         |
+| `release.site.validate` | perform minimal validation of the site                     |
+| `release.arm.untar`     | untar the ARM binary file                                  |
+| `release.arm.commands`  | execute commands against the arm binaries                  |
 
 Set `check.native.binaries` to false to skip native binary checks on
 platforms without them.
 
@@ -324,6 +356,20 @@
 Downloads under `downloads/incoming`
 
 ```
 ant release.fetch.http
 ```
+
+### verify gpg signatures
+
+```bash
+ant gpg.keys gpg.verify
+```
+
+This will import all the KEYS from
+[https://downloads.apache.org/hadoop/common/KEYS](https://downloads.apache.org/hadoop/common/KEYS),
+then verify the signature of each downloaded file.
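Under the hood, `ant gpg.keys gpg.verify` amounts to fetching the release managers' public keys and running `gpg --verify` against each detached `.asc` signature. A hand-run sketch (the RC directory name is a hypothetical example, and the file list is an illustration rather than the target's exact set):

```bash
# Manual sketch of key import plus per-artifact signature verification.
# Directory and file names here are assumed, not taken from build.xml.
gpg --fetch-keys https://downloads.apache.org/hadoop/common/KEYS
cd downloads/release/hadoop-3.4.0-RC3
for f in *.tar.gz CHANGELOG.md RELEASENOTES.md; do
  gpg --verify "$f.asc" "$f"
done
```

`gpg --verify` exits non-zero on a bad or unknown signature, so the loop stops a broken RC early.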
+
+If you don't yet trust the key of whoever signed the release then
+now is the time to use gpg to declare that you trust them,
+after performing whatever verification you consider wise.
+
 ### untar source and build.
 
 This puts the built artifacts into the local maven repo so
@@ -335,11 +381,11 @@
 ant release.src.untar release.src.build
 ```
 
-### untar site and build.
+### untar site and validate.
 
 ```bash
-ant release.site.untar
+ant release.site.untar release.site.validate
 ```
 
@@ -568,7 +614,7 @@
 change the
 ```
 
 Check that release URL in your browser.
 
-## Publish nexus artifacts
+## Publish Maven artifacts
 
 do this at
 [https://repository.apache.org/#stagingRepositories](https://repository.apache.org/#stagingRepositories)
 
@@ -672,7 +718,7 @@
 For some unknown reason the parquet build doesn't seem to cope.
 ```
 
-# Rolling back an RC
+# Aborting an RC
 
 Drop the staged artifacts from nexus
 [https://repository.apache.org/#stagingRepositories](https://repository.apache.org/#stagingRepositories)

diff --git a/build.xml b/build.xml
index 22148c1..d93499b 100644
--- a/build.xml
+++ b/build.xml
@@ -214,6 +214,19 @@
     </gpg>
   </presetdef>
 
+  <!-- verify a file in the release dir;
+       automatically adds the .asc suffix -->
+  <macrodef name="gpgverify" >
+    <attribute name="name" />
+    <sequential>
+      <echo>Verifying GPG signature of ${release.dir}/@{name}</echo>
+      <gpg dir="${release.dir}">
+        <arg value="--verify"/>
+        <arg value="@{name}.asc"/>
+      </gpg>
+    </sequential>
+  </macrodef>
+
   <presetdef name="svn">
     <x executable="svn"/>
   </presetdef>
@@ -349,8 +362,7 @@
   <target name="gpg.keys"
     depends="init"
-    description="fetch GPG keys">
-
+    description="fetch and import GPG keys">
     <gpg>
       <arg value="--fetch-keys"/>
       <arg value="https://downloads.apache.org/hadoop/common/KEYS"/>
@@ -360,41 +372,14 @@
   <target name="gpg.verify"
     depends="release.dir.check"
     description="verify the downloaded artifacts">
-    <echo>Verifying GPG signatures of artifacts in ${release.dir}</echo>
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="${release}-src.tar.gz.asc"/>
-    </gpgv>
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="${release}-site.tar.gz.asc"/>
-    </gpgv>
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="${release}.tar.gz.asc"/>
-    </gpgv>
-
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="${release}-rat.txt.asc"/>
-    </gpgv>
-
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="RELEASENOTES.md.asc"/>
-    </gpgv>
-
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="CHANGELOG.md.asc"/>
-    </gpgv>
-
-    <gpgv>
-      <arg value="--verify"/>
-      <arg value="${arm.binary.filename}.asc"/>
-    </gpgv>
-
+    <gpgverify name="CHANGELOG.md"/>
+    <gpgverify name="RELEASENOTES.md"/>
+    <gpgverify name="${release}-src.tar.gz"/>
+    <gpgverify name="${release}-site.tar.gz"/>
+    <gpgverify name="${release}.tar.gz"/>
+    <gpgverify name="${release}-rat.txt"/>
+    <gpgverify name="${arm.binary.filename}"/>
   </target>

@@ -991,16 +976,6 @@ Message is in file ${message.out}
       usetimestamp="true"/>
   </target>
 
-  <target name="release.site.untar"
-    depends="release.dir.check"
-    description="untar the release site">
-    <echo>untarring site ${release.dir}/${release}-site.tar.gz</echo>
-    <mkdir dir="target/untar"/>
-
-    <gunzip src="${release.dir}/${release}-site.tar.gz" dest="target/untar"/>
-    <untar src="target/untar/${release}-site.tar" dest="${release.site.dir}" />
-    <echo>site is under ${release.site.dir}</echo>
-  </target>
-
   <target name="release.src.untar"
     depends="release.dir.check"
     description="untar the release source">
     <echo>untarring source ${release.dir}/${release}-src.tar.gz to ${release.source.dir}</echo>
@@ -1279,16 +1254,30 @@ Message is in file ${message.out}
   </target>
 
+
+  <target name="release.site.untar"
+    depends="release.dir.check"
+    description="untar the release site">
+    <echo>untarring site ${release.dir}/${release}-site.tar.gz</echo>
+    <mkdir dir="target/untar"/>
+
+    <gunzip src="${release.dir}/${release}-site.tar.gz" dest="target/untar"/>
+    <untar src="target/untar/${release}-site.tar" dest="${release.site.dir}" />
+    <echo>site is under ${release.site.dir}</echo>
+  </target>
+
   <!-- check the untarred site -->
-  <target name="site.validate"
+  <target name="release.site.validate"
     depends="init"
     description="validate the API docs in the site. download/untar the site first">
-    <echo>validate site docs. run release.site.untar first</echo>
+    <echo>validating site docs; you must have run release.site.untar first</echo>
     <require-dir path="${site.dir}"/>
+    <require-file path="${site.dir}/index.html"/>
     <require-dir path="${site.dir}/api"/>
     <require-file path="${site.dir}/api/index.html"/>
-
+    <echo>Basic validation completed; view in browser at
+      file://${site.dir}/index.html
+    </echo>
   </target>

diff --git a/pom.xml b/pom.xml
index 58fd7f3..e42e556 100644
--- a/pom.xml
+++ b/pom.xml
@@ -32,10 +32,6 @@
 
     <hadoop.version>3.4.0</hadoop.version>
 
-    <!-- SLF4J/LOG4J version -->
-    <slf4j.version>1.7.36</slf4j.version>
-    <reload4j.version>1.2.22</reload4j.version>
-
   </properties>
 
   <dependencies>
@@ -150,33 +146,64 @@
       <artifactId>hadoop-cloud-storage</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
-<!-- why not?
-
+<!--
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>federation-balance</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
--->
     <dependency>
       <groupId>org.apache.hadoop</groupId>
-      <artifactId>hadoop-resourceestimator</artifactId>
+      <artifactId>hadoop-hadoop-fs2img</artifactId>
+      <version>${hadoop.version}</version>
+    </dependency>
+-->
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-kafka</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
+
     <dependency>
       <groupId>org.apache.hadoop</groupId>
-      <artifactId>hadoop-sls</artifactId>
+      <artifactId>hadoop-extras</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-gridmix</artifactId>
+      <version>${hadoop.version}</version>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-archives</artifactId>
+      <version>${hadoop.version}</version>
+    </dependency>
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-archive-logs</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-resourceestimator</artifactId>
+      <version>${hadoop.version}</version>
+    </dependency>
+
+    <dependency>
+      <groupId>org.apache.hadoop</groupId>
+      <artifactId>hadoop-sls</artifactId>
+      <version>${hadoop.version}</version>
+    </dependency>
+
     <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-datajoin</artifactId>
@@ -194,6 +221,7 @@
       <artifactId>hadoop-dynamometer-blockgen</artifactId>
       <version>${hadoop.version}</version>
     </dependency>
+
     <dependency>
       <groupId>org.apache.hadoop</groupId>
@@ -207,53 +235,13 @@
       <version>${hadoop.version}</version>
     </dependency>
 
-
     <dependency>
       <groupId>junit</groupId>
       <artifactId>junit</artifactId>
       <version>4.13.2</version>
       <scope>test</scope>
     </dependency>
-    <!--
-    <dependency>
-      <groupId>org.slf4j</groupId>
-      <artifactId>slf4j-api</artifactId>
-      <version>${slf4j.version}</version>
-    </dependency>
-    <dependency>
-      <groupId>org.slf4j</groupId>
-      <artifactId>slf4j-reload4j</artifactId>
-      <version>${slf4j.version}</version>
-    </dependency>
-    <dependency>
-      <groupId>ch.qos.reload4j</groupId>
-      <artifactId>reload4j</artifactId>
-      <version>${reload4j.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>com.sun.jdmk</groupId>
-          <artifactId>jmxtools</artifactId>
-        </exclusion>
-        <exclusion>
-          <groupId>com.sun.jmx</groupId>
-          <artifactId>jmxri</artifactId>
-        </exclusion>
-        <exclusion>
-          <groupId>javax.mail</groupId>
-          <artifactId>mail</artifactId>
-        </exclusion>
-        <exclusion>
-          <groupId>javax.jms</groupId>
-          <artifactId>jmx</artifactId>
-        </exclusion>
-        <exclusion>
-          <groupId>javax.jms</groupId>
-          <artifactId>jms</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    -->
+
   </dependencies>
 
 <!--
@@ -303,12 +291,6 @@
       </repository>
     </repositories>
   </profile>
-  <profile>
-    <id>hadoop-3.3.5</id>
-    <properties>
-      <hadoop.version>3.3.5</hadoop.version>
-    </properties>
-  </profile>
 
   <profile>
     <id>3.4</id>

diff --git a/src/releases/release-info-3.5.0.properties b/src/releases/release-info-3.5.0.properties
new file mode 100644
index 0000000..d425b88
--- /dev/null
+++ b/src/releases/release-info-3.5.0.properties
@@ -0,0 +1,34 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#     https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# property file for the 3.5.0 release
+# this is more to test the release process than to actually release 3.5.0
+hadoop.version=3.5.0
+rc=RC0
+previous.version=3.3.6
+release.branch=trunk
+git.commit.id=trunk
+
+jira.id=HADOOP-TBD
+jira.title=Release 3.5.0
+
+#amd.src.dir=https://dist.apache.org/repos/dist/dev/hadoop/hadoop-3.4.0-RC2
+#arm.src.dir=${amd.src.dir}
+#http.source=${amd.src.dir}
+#asf.staging.url=https://repository.apache.org/content/repositories/orgapachehadoop-1402
+
+cloudstore.profile=sdk2

---------------------------------------------------------------------
To unsubscribe, e-mail: common-commits-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-commits-h...@hadoop.apache.org