mik-laj commented on a change in pull request #11310:
URL: https://github.com/apache/airflow/pull/11310#discussion_r501952278



##########
File path: dev/README.md
##########
@@ -211,364 +1091,295 @@ Checking 
apache_airflow-1.10.12rc4-py2.py3-none-any.whl.sha512
 Checking apache-airflow-1.10.12rc4-source.tar.gz.sha512
 ```
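For reference, the checksum comparison above can be sketched in Python (a minimal illustration using `hashlib`; the file names in the comment come from the example output, and the assumption that the `.sha512` file's first whitespace-separated field is the digest is mine):

```python
import hashlib

def sha512_of(path: str) -> str:
    """Compute the hex SHA-512 digest of a file, reading it in chunks."""
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage, comparing against a published .sha512 file:
# expected = open("apache-airflow-1.10.12rc4-source.tar.gz.sha512").read().split()[0]
# assert sha512_of("apache-airflow-1.10.12rc4-source.tar.gz") == expected
```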
 
-# Verifying if the release candidate "works" by Contributors
+### Verify if the Backport Packages release candidates "work" by Contributors
 
This can be done (and we encourage it) by any of the Contributors. In fact, it's best if the
actual users of Apache Airflow test it in their own staging/test installations. Each release candidate
is available on PyPI apart from the SVN packages, so everyone should be able to install
the release candidate version of Airflow (<VERSION> is 1.10.12 for example, and <X> is the
release candidate number: 1, 2, 3, ...).
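For illustration, the `<VERSION>rc<X>` naming convention can be parsed like this (a hypothetical helper, not part of the release tooling):

```python
import re

def split_rc(version: str):
    """Split '1.10.12rc4' into ('1.10.12', 4); return (version, None) if no rc suffix."""
    m = re.fullmatch(r"(.*?)rc(\d+)", version)
    if m:
        return m.group(1), int(m.group(2))
    return version, None

print(split_rc("1.10.12rc4"))  # the base version and the release candidate number
```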
 
-```bash
-pip install apache-airflow==<VERSION>rc<X>
-```
-Optionally it can be followed with constraints
-
-```bash
-pip install apache-airflow==<VERSION>rc<X> \
-  --constraint 
"https://raw.githubusercontent.com/apache/airflow/constraints-<VERSION>/constraints-3.6.txt"`
-```
+You can use any of the installation methods you prefer (you can even install 
it via the binary wheels
+downloaded from the SVN).
 
-Note that the constraints contain python version that you are installing it 
with.
 
-You can use any of the installation methods you prefer (you can even install 
it via the binary wheel
-downloaded from the SVN).
+#### Installing in your local virtualenv
 
-There is also an easy way of installation with Breeze if you have the latest 
sources of Apache Airflow.
-Running the following command will use tmux inside breeze, create `admin` user 
and run Webserver & Scheduler:
+You have to make sure you have Airflow 1.10.* installed in your pip virtualenv
+(the version of Airflow you want to install the providers with).
 
-```
-./breeze start-airflow --install-airflow-version <VERSION>rc<X> --python 3.7 
--backend postgres
+```shell script
+pip install apache-airflow-backport-providers-<provider>==<VERSION>rc<X>
 ```
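After installing, a quick sanity check (a minimal sketch of my own, not part of the official verification steps) is to confirm the provider modules can be located; the backport packages install their modules under the `airflow.providers` namespace:

```python
import importlib.util

def is_importable(module_name: str) -> bool:
    """Return True if the module can be located on this interpreter's path."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in a dotted name is missing.
        return False

# Hypothetical check after installing apache-airflow-backport-providers-google:
for module in ("airflow", "airflow.providers.google"):
    status = "OK" if is_importable(module) else "MISSING"
    print(f"{module}: {status}")
```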
 
-Once you install and run Airflow, you should perform any verification you see 
as necessary to check
-that the Airflow works as you expected.
+#### Installing with Breeze
 
+There is also an easy way of installing the packages with Breeze if you have the latest
+sources of Apache Airflow. Here is a typical scenario.
 
-# Building an RC
-
-The Release Candidate artifacts we vote upon should be the exact ones we vote 
against, without any modification than renaming – i.e. the contents of the 
files must be the same between voted release canidate and final release. 
Because of this the version in the built artifacts that will become the 
official Apache releases must not include the rcN suffix.
+First, copy all the provider package `.whl` files to the `dist` folder.
 
-- Set environment variables
+```shell script
+./breeze start-airflow --install-airflow-version <VERSION>rc<X> \
+    --python 3.7 --backend postgres --install-wheels
 ```
-# Set Version
-export VERSION=1.10.2rc3
-
-
-# Set AIRFLOW_REPO_ROOT to the path of your git repo
-export AIRFLOW_REPO_ROOT=$(pwd)
 
+For 1.10 releases you can also use the `--no-rbac-ui` flag to disable the RBAC UI of Airflow:
 
-# Example after cloning
-git clone https://github.com/apache/airflow.git airflow
-cd airflow
-export AIRFLOW_REPO_ROOT=$(pwd)
+```shell script
+./breeze start-airflow --install-airflow-version <VERSION>rc<X> \
+    --python 3.7 --backend postgres --install-wheels --no-rbac-ui
 ```
 
-- set your version to 1.10.2 in airflow/version.py (without the RC tag)
-- Commit the version change.
-
-- Tag your release
+#### Building your own docker image
 
-`git tag ${VERSION}`
+If you prefer to build your own image, you can also use the official image and PyPI packages to test
+backport packages. If you need to fully test the integration, sometimes you also have to install
+additional components. Below is an example Dockerfile, which installs the backport providers for Google:
 
-- Clean the checkout: the sdist step below will
-`git clean -fxd`
+```
+FROM apache/airflow:1.10.12
 
-- Tarball the repo
-`git archive --format=tar.gz ${VERSION} --prefix=apache-airflow-${VERSION}/ -o 
apache-airflow-${VERSION}-source.tar.gz`
+RUN pip install --user apache-airflow-backport-providers-google==2020.10.5.rc1
 
-- Generate sdist
-NOTE: Make sure your checkout is clean at this stage - any untracked or 
changed files will otherwise be included in the file produced.
-`python setup.py compile_assets sdist bdist_wheel`
+RUN curl https://sdk.cloud.google.com | bash \
+    && echo "source /home/airflow/google-cloud-sdk/path.bash.inc" >> 
/home/airflow/.bashrc \
+    && echo "source /home/airflow/google-cloud-sdk/completion.bash.inc" >> 
/home/airflow/.bashrc
 
-- Rename the sdist
-```
-mv dist/apache-airflow-${VERSION%rc?}.tar.gz 
apache-airflow-${VERSION}-bin.tar.gz
-mv dist/apache_airflow-${VERSION%rc?}-py2.py3-none-any.whl 
apache_airflow-${VERSION}-py2.py3-none-any.whl
-```
+USER 0
+RUN KUBECTL_VERSION="$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)" \
+    && KUBECTL_URL="https://storage.googleapis.com/kubernetes-release/release/${KUBECTL_VERSION}/bin/linux/amd64/kubectl" \
+    && curl -L "${KUBECTL_URL}" --output /usr/local/bin/kubectl \
+    && chmod +x /usr/local/bin/kubectl
 
-- Generate SHA512/ASC (If you have not generated a key yet, generate it by 
following instructions on 
http://www.apache.org/dev/openpgp.html#key-gen-generate-key)
-```
-${AIRFLOW_REPO_ROOT}/dev/sign.sh apache-airflow-${VERSION}-source.tar.gz
-${AIRFLOW_REPO_ROOT}/dev/sign.sh apache-airflow-${VERSION}-bin.tar.gz
-${AIRFLOW_REPO_ROOT}/dev/sign.sh apache_airflow-${VERSION}-py2.py3-none-any.whl
+USER ${AIRFLOW_UID}
 ```
 
-- Push Tags
-`git push --tags`
+To build the image and run a shell in it, run:
 
-- Push the artifacts to ASF dev dist repo
 ```
-# First clone the repo
-svn checkout https://dist.apache.org/repos/dist/dev/airflow airflow-dev
-
-# Create new folder for the release
-cd airflow-dev
-svn mkdir ${VERSION}
-
-# Move the artifacts to svn folder & commit
-mv ${AIRFLOW_REPO_ROOT}/apache{-,_}airflow-${VERSION}* ${VERSION}/
-cd ${VERSION}
-svn add *
-svn commit -m "Add artifacts for Airflow ${VERSION}"
+docker build . -t my-airflow
+docker run  -ti \
+    --rm \
+    -v "$PWD/data:/opt/airflow/" \
+    -v "$PWD/keys/:/keys/" \
+    -p 8080:8080 \
+    -e GOOGLE_APPLICATION_CREDENTIALS=/keys/sa.json \
+    -e AIRFLOW__CORE__LOAD_EXAMPLES=True \
+    my-airflow

Review comment:
```suggestion
    my-airflow bash
```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

