[ 
https://issues.apache.org/jira/browse/FLINK-33834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jing Ge resolved FLINK-33834.
-----------------------------
    Resolution: Fixed

> Build and stage Java and Python artifacts
> -----------------------------------------
>
>                 Key: FLINK-33834
>                 URL: https://issues.apache.org/jira/browse/FLINK-33834
>             Project: Flink
>          Issue Type: Sub-task
>            Reporter: Jing Ge
>            Assignee: Jing Ge
>            Priority: Major
>
> # Create a local release branch ((!) this step cannot be skipped for minor 
> releases):
> {code:bash}
> $ cd ./tools
> tools/ $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION 
> RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
> {code}
>  # Tag the release commit:
> {code:bash}
> $ git tag -s ${TAG} -m "${TAG}"
> {code}
>  # We now need to do several things:
>  ## Create the source release archive
>  ## Deploy jar artifacts to the [Apache Nexus 
> Repository|https://repository.apache.org/], which is the staging area for 
> deploying the jars to Maven Central
>  ## Build PyFlink wheel packages
> You might want to create a directory on your local machine for collecting the 
> various source and binary releases before uploading them. Creating the binary 
> releases is a lengthy process but you can do this on another machine (for 
> example, in the "cloud"). When doing this, you can skip signing the release 
> files on the remote machine, download them to your local machine and sign 
> them there.
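> A minimal sketch of such a collection area (the paths and the remote host 
> name are only examples):
> {code:bash}
> # Local staging area for collecting release files before upload
> $ mkdir -p ~/flink-release-staging/{source,binaries,python}
> # If the binaries were built remotely, fetch them before signing locally
> $ scp remote-build-host:flink-build/flink-*-bin*.tgz ~/flink-release-staging/binaries/
> {code}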
>  # Build the source release:
> {code:bash}
> tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
> {code}
>  # Stage the maven artifacts:
> {code:bash}
> tools $ releasing/deploy_staging_jars.sh
> {code}
> Review all staged artifacts ([https://repository.apache.org/]). They should 
> contain all relevant parts for each module, including pom.xml, jar, test jar, 
> source, test source, javadoc, etc. Carefully review any new artifacts.
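> One way to sanity-check a single staged module from the command line, after 
> downloading its files, before reviewing it in the Nexus UI (module name and 
> version are only examples):
> {code:bash}
> # Report any expected artifact type that is missing for one module
> $ MODULE=flink-core VERSION=1.19.0
> $ for suffix in .pom .jar -tests.jar -sources.jar -test-sources.jar -javadoc.jar; do ls "${MODULE}-${VERSION}${suffix}" >/dev/null 2>&1 || echo "missing: ${suffix}"; done
> {code}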
>  # Close the staging repository on Apache Nexus. When prompted for a 
> description, enter “Apache Flink, version X, release candidate Y”.
> Then, you need to build the PyFlink wheel packages (since 1.11):
>  # Set up an Azure Pipeline in your own Azure account. You can refer to 
> [Azure 
> Pipelines|https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository]
>  for more details on how to set up an Azure Pipeline for a fork of the Flink 
> repository. Note that a Google Cloud mirror in Europe is used for downloading 
> Maven artifacts, so it is recommended to set your [Azure organization 
> region|https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-location]
>  to Europe to speed up the downloads.
>  # Push the release candidate branch to your forked personal Flink 
> repository, e.g.
> {code:bash}
> tools $ git push <remote> 
> refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}
> {code}
>  # Trigger the Azure Pipelines manually to build the PyFlink wheel packages
>  ## Go to your Azure Pipelines Flink project → Pipelines
>  ## Click the "New pipeline" button on the top right
>  ## Select "GitHub" → your GitHub Flink repository → "Existing Azure 
> Pipelines YAML file"
>  ## Select your branch → Set path to "/azure-pipelines.yaml" → click on 
> "Continue" → click on "Variables"
>  ## Click the "New variable" button, set the name to "MODE" and the value 
> to "release". Click "OK" to set the variable and "Save" to save it, then 
> back on the "Review your pipeline" screen click "Run" to trigger the build.
>  ## You should now see a build in which only the "CI build (release)" job is running
>  # Download the PyFlink wheel packages from the build result page after the 
> "build_wheels mac" and "build_wheels linux" jobs have finished.
>  ## Download the PyFlink wheel packages
>  ### Open the build result page of the pipeline
>  ### Go to the {{Artifacts}} page (build_wheels linux -> 1 artifact)
>  ### Click {{wheel_Darwin_build_wheels mac}} and {{wheel_Linux_build_wheels 
> linux}} separately to download the zip files
>  ## Unzip these two zip files
> {code:bash}
> $ cd /path/to/downloaded_wheel_packages
> $ unzip wheel_Linux_build_wheels\ linux.zip
> $ unzip wheel_Darwin_build_wheels\ mac.zip
> {code}
>  ## Create a {{dist}} directory under the {{{}flink-python{}}} directory:
> {code:bash}
> $ cd <flink-dir>
> $ mkdir flink-python/dist
> {code}
>  ## Move the unzipped wheel packages into 
> {{{}flink-python/dist{}}}:
> {code:bash}
> $ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
> $ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
> $ cd tools
> {code}
> Finally, we create the binary convenience release files:
> {code:bash}
> tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
> {code}
> If you want to run this step in parallel on a remote machine you have to make 
> the release commit available there (for example by pushing to a repository). 
> *This is important: the commit inside the binary builds has to match the 
> commit of the source builds and the tagged release commit.* 
> When building remotely, you can skip gpg signing by setting 
> {{{}SKIP_GPG=true{}}}. You would then sign the files manually after 
> downloading them to your machine:
> {code:bash}
> $ for f in flink-*-bin*.tgz; do gpg --armor --detach-sig $f; done
> $ gpg --armor --detach-sig apache-flink-*.tar.gz
> {code}
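> To make sure the remote build really sits on the tagged release commit, you 
> can compare the commit hashes, e.g. (the host name and checkout path are 
> only examples):
> {code:bash}
> # Compare the tagged commit with HEAD on the remote build machine
> $ LOCAL_COMMIT=$(git rev-list -n 1 ${TAG})
> $ REMOTE_COMMIT=$(ssh remote-build-host "cd flink && git rev-parse HEAD")
> $ [ "$LOCAL_COMMIT" = "$REMOTE_COMMIT" ] && echo "commits match" || echo "COMMIT MISMATCH"
> {code}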
> The release manager needs to make sure the PyPI projects {{apache-flink}} and 
> {{apache-flink-libraries}} have enough available space for the Python 
> artifacts. The remaining space must be larger than the size of 
> {{{}tools/releasing/release/python{}}}. Log in with the PyPI admin account 
> ([account 
> info|https://lists.apache.org/thread.html/8273a5e8834b788d8ae552a5e177b69e04e96c0446bb90979444deee@%3Cprivate.flink.apache.org%3E]
>  is only available to PMC members) and check the remaining space in [project 
> settings|http://pypi.org/manage/project/apache-flink-libraries/settings].
> Request an increase if there is not enough space. Note that it can take 
> several days for PyPI to review such a request.
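> The space the upload will need can be checked locally after the release 
> scripts have run, e.g.:
> {code:bash}
> # Size of the Python artifacts that will be uploaded to PyPI
> $ du -sh tools/releasing/release/python
> {code}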
>  
> ----
> h3. Expectations
>  * Check hashes (e.g. {{shasum -c *.sha512}})
>  * Check signatures (e.g. {{{}gpg --verify 
> flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz{}}})
>  * {{grep}} for legal headers in each file.
>  * If time allows, check in advance the NOTICE files of the modules whose 
> dependencies have changed in this release, since license issues pop up 
> during voting from time to time. See the "Checking License" section of 
> [Verifying a Flink 
> Release|https://cwiki.apache.org/confluence/display/FLINK/Verifying+a+Flink+Release].
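> The legal-header check can be sketched as follows (the header string and 
> file pattern are only examples; other source file types should be checked 
> the same way):
> {code:bash}
> # List Java source files that are missing the Apache license header
> $ grep -rL "Licensed to the Apache Software Foundation" --include="*.java" .
> {code}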



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
