Re: [PR] Implement the breeze tag_providers command [airflow]
amoghrajesh commented on code in PR #38447: URL: https://github.com/apache/airflow/pull/38447#discussion_r1537083067

## dev/breeze/doc/09_release_management_tasks.rst: ##
@@ -189,6 +189,26 @@ These are all of the available flags for the ``release-prod-images`` command:
     :width: 100%
     :alt: Breeze release management release prod images

+Adding git tags for providers
+"""

Review Comment: There are a few quotes missing. You need to cover the heading with quotes.

## dev/breeze/src/airflow_breeze/commands/release_management_commands.py: ##
@@ -949,6 +950,70 @@ def run_generate_constraints_in_parallel(
     )

+@release_management.command(
+    name="tag-providers",
+    help="Generates tags for providers.",

Review Comment: nit: Generates tags for airflow provider releases.

## dev/breeze/src/airflow_breeze/commands/release_management_commands.py: ##
@@ -949,6 +950,70 @@ def run_generate_constraints_in_parallel(
     )

+@release_management.command(
+    name="tag-providers",
+    help="Generates tags for providers.",
+)
+@option_dry_run
+@option_verbose
+def tag_providers():
+    found_remote = None
+    remotes = ["origin", "apache"]
+    for remote in remotes:
+        try:
+            command = ["git", "remote", "get-url", "--push", shlex.quote(remote)]
+            result = subprocess.run(command, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL, text=True)
+            if "apache/airflow.git" in result.stdout:
+                found_remote = remote
+                break
+        except subprocess.CalledProcessError:
+            pass
+
+    if found_remote is None:
+        raise ValueError("Could not find remote configured to push to apache/airflow")
+
+    tags = []
+    for file in os.listdir(os.path.join(SOURCE_DIR_PATH, "dist")):
+        if file.endswith(".whl"):
+            match = re.match(r".*airflow_providers_(.*)-(.*)-py3.*", file)
+            if match:
+                provider = f"providers-{match.group(1).replace('_', '-')}"
+                tag = f"{provider}/{match.group(2)}"
+                try:
+                    subprocess.run(
+                        ["git", "tag", shlex.quote(tag), "-m", f"Release {date.today()} of providers"],
+                        check=True,
+                    )
+                    tags.append(tag)
+                except subprocess.CalledProcessError:
+                    pass
+
+    if tags and len(tags) > 0:
+        try:
+            push_command = ["git", "push", remote] + [shlex.quote(tag) for tag in tags]
+            push_result = subprocess.Popen(
+                push_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True
+            )
+            push_output, push_error = push_result.communicate()
+            if push_output:
+                print(push_output)
+            if push_error:
+                print(push_error)
+            print("Tags pushed successfully")

Review Comment: Let us use get_console().print() instead, with the appropriate error level. For example, for success you can do this: `get_console().print(f"[success]{push_output}[/]")`. For error: `get_console().print(f"[error]{push_error}[/]")`. And similarly in other places as well.

## dev/breeze/doc/09_release_management_tasks.rst: ##
@@ -189,6 +189,26 @@ These are all of the available flags for the ``release-prod-images`` command:
     :width: 100%
     :alt: Breeze release management release prod images

+Adding git tags for providers
+"""
+
+Assume that your remote for apache repository is called apache you should now set tags for the providers in the repo.
+Sometimes in cases when there is a connectivity issue to Github, it might be possible that local tags get created and lead to annoying errors.
+The default behaviour would be to clean such local tags up.
+
+If you want to disable this behaviour, set the env CLEAN_LOCAL_TAGS to false.

Review Comment: This text doesn't define the intention of the command. Can you reword it to something like this:

```
This command can be utilized to manage git tags for providers within the airflow remote repository during provider releases.
Sometimes in cases when there is a connectivity issue to Github, it might be possible that local tags get created and lead to annoying errors.
The default behaviour would be to clean such local tags up.
```

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. 
To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
Re: [PR] changing dag_processing.processes from UpDownCounter to guage [airflow]
Bowrna commented on PR #38400: URL: https://github.com/apache/airflow/pull/38400#issuecomment-2017270573

When setting `dag_processing.processes` as a gauge instead of a counter, we may need to initialize the value. A gauge works in the following way:

gaugor:333|g sets the param `gaugor` to 333
gaugor:433|g sets the param `gaugor` to 433
gaugor:-10|g sets the param `gaugor` to 423
gaugor:+4|g sets the param `gaugor` to 427

So, to keep increasing the process count as each process starts and decreasing it as each one stops (finishes, times out, or errors out), we first have to initialize the gauge to zero. I could set the gauge to zero at the start of the `airflow scheduler` command. https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/scheduler.html#running-more-than-one-scheduler But there is an option to run more than one scheduler, so setting the param to zero at the beginning of the `airflow scheduler` command wouldn't be the right way. How do you think this case can be better handled? @potiuk @dirrao @ferruzzi @hussein-awala
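The gauge semantics described in the comment above can be sketched in plain Python. This is a hedged illustration of the StatsD `|g` protocol only, not Airflow's metrics code; the `apply_gauge_ops` helper is hypothetical:

```python
def apply_gauge_ops(ops, initial=0):
    """Apply StatsD gauge operations like 'gaugor:333|g'.

    A bare value sets the gauge; a value prefixed with '+' or '-'
    adjusts it relative to the current value.
    """
    value = initial
    for op in ops:
        payload = op.split(":", 1)[1]   # e.g. "333|g"
        raw = payload.split("|", 1)[0]  # e.g. "333", "-10", "+4"
        if raw[0] in "+-":
            value += int(raw)           # relative adjustment
        else:
            value = int(raw)            # absolute set
    return value


# Mirrors the sequence in the comment: 333 -> 433 -> 423 -> 427
print(apply_gauge_ops(["gaugor:333|g", "gaugor:433|g", "gaugor:-10|g", "gaugor:+4|g"]))
# 427
```

This also shows why initialization matters: the `+`/`-` forms are only meaningful relative to some known starting value.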
Re: [PR] Add legend for duration markline. [airflow]
dirrao commented on code in PR #38434: URL: https://github.com/apache/airflow/pull/38434#discussion_r1537079098

## airflow/www/static/js/dag/details/dag/RunDurationChart.tsx: ##
@@ -164,7 +164,43 @@ const RunDurationChart = ({ showLandingTimes }: Props) => { `; }
+  function formatMarkLineLegendName(name: string) {
+    switch (name) {
+      case "runDurationUnit":
+        return "Mean run duration";

Review Comment: Change it to median as per the @simond comment.
[PR] Implement the breeze tag_providers command [airflow]
poorvirohidekar opened a new pull request, #38447: URL: https://github.com/apache/airflow/pull/38447 This PR adds support to move the tag_providers.py script to breeze under release-management and addresses the review comments from the PR referenced below. Reference: https://github.com/apache/airflow/pull/38278 --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
Re: [PR] Add legend for duration markline. [airflow]
tirkarthi commented on PR #38434: URL: https://github.com/apache/airflow/pull/38434#issuecomment-2017101047 @simond Thanks, I will update the legend. Please let me know if median total duration is more useful than median run duration.
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536985669 ## .github/workflows/ci-image-build.yml: ## @@ -195,19 +194,19 @@ ${{ inputs.do-build == 'true' && inputs.image-tag || '' }}" run: > breeze ci-image build --tag-as-latest --image-tag "${{ inputs.image-tag }}" --python "${{ matrix.python-version }}" - --platform "linux/${{ inputs.platform }}" + --platform "${{ inputs.platform }}" env: DOCKER_CACHE: ${{ inputs.docker-cache }} INSTALL_MYSQL_CLIENT_TYPE: ${{ inputs.install-mysql-client-type }} UPGRADE_TO_NEWER_DEPENDENCIES: ${{ inputs.upgrade-to-newer-dependencies }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - BUILDER: ${{ inputs.platform == 'amd64' && 'default' || 'airflow_cache' }} + BUILDER: "airflow_cache" Review Comment: That was the clue. BUILDER should always be set to the same `airflow_cache`, and then the cache will be nicely reused and rebuilt.
(airflow) branch fix-image-cache updated (579cd39821 -> e4648ad9a1)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git

omit 579cd39821 Fix image cache optimizations - speeding up the build
add 3832ead221 Fix failing MinSQLAlchemy test after moving sqlalchemy spec
add e4648ad9a1 Fix image cache optimizations - speeding up the build

This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O   (579cd39821)
            \
             N -- N -- N   refs/heads/fix-image-cache (e4648ad9a1)

You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. No new revisions were added by this update.

Summary of changes:
 .github/workflows/run-unit-tests.yml |  2 +-
 Dockerfile.ci                        | 12 ++--
 scripts/docker/entrypoint_ci.sh      | 12 ++--
 3 files changed, 13 insertions(+), 13 deletions(-)
Re: [PR] Allow users to write dag_id and task_id in their national characters, added display name for dag / task (v2) [airflow]
potiuk commented on PR #38446: URL: https://github.com/apache/airflow/pull/38446#issuecomment-2017069297 The failing MinSQLAlchemy test here is fixed in #38445
Re: [PR] allows users to write dag_id and task_id in their national characters, added display name for dag / task [airflow]
jscheffl commented on PR #32520: URL: https://github.com/apache/airflow/pull/32520#issuecomment-2017051703 As linked in #38446 - I tried to spend time on the weekend to continue working on this PR... let's see if we can get this reviewed and completed...
Re: [PR] Fix failing MinSQLAlchemy test after moving sqlalchemy spec [airflow]
potiuk commented on PR #38444: URL: https://github.com/apache/airflow/pull/38444#issuecomment-2017045033 Closed in favour of canary test in #38445
[PR] Fix 22073 [airflow]
jscheffl opened a new pull request, #38446: URL: https://github.com/apache/airflow/pull/38446 This PR tries to continue the work from @xgao1023 in PR #35320. It merges with current main. On top of this, I fixed the bugs in pytest and some glitches in the UI. I'd LOVE to get this into 2.9.0, and as there were a couple of reviews already on the previous PR, I hope it is not as complex as it looks. A lot of code is also due to the fact that I try to migrate all examples over to "nicer names". Sneak preview of how it could look after this PR: ![image](https://github.com/apache/airflow/assets/95105677/071155eb-a4c2-4a91-be26-84ce9b7c7d4f) closes: #22073 related: #32520, #28183
[PR] Fix failing MinSQLAlchemy test after moving sqlalchemy spec [airflow]
potiuk opened a new pull request, #38445: URL: https://github.com/apache/airflow/pull/38445 The MinSQLAlchemy test was based on retrieving the sqlalchemy min version from pyproject.toml, but since we moved it to hatch_build.py we should read it from there.
(airflow) branch fix-failing-sqlalchemy-test created (now 3832ead221)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-failing-sqlalchemy-test in repository https://gitbox.apache.org/repos/asf/airflow.git at 3832ead221 Fix failing MinSQLAlchemy test after moving sqlalchemy spec This branch includes the following new commits: new 3832ead221 Fix failing MinSQLAlchemy test after moving sqlalchemy spec The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
Re: [PR] Fix failing MinSQLAlchemy test after moving sqlalchemy spec [airflow]
potiuk closed pull request #38444: Fix failing MinSQLAlchemy test after moving sqlalchemy spec URL: https://github.com/apache/airflow/pull/38444
(airflow) 01/01: Fix failing MinSQLAlchemy test after moving sqlalchemy spec
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch fix-failing-sqlalchemy-test in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3832ead22158ed963417367082868f5563f8f734
Author: Jarek Potiuk
AuthorDate: Mon Mar 25 02:11:39 2024 +0100

    Fix failing MinSQLAlchemy test after moving sqlalchemy spec

    The MinSQLAlchemy test was based on retrieving the sqlalchemy min version from pyproject.toml, but since we moved it to hatch_build.py we should read it from there.
---
 .github/workflows/run-unit-tests.yml |  2 +-
 Dockerfile.ci                        | 12 ++--
 scripts/docker/entrypoint_ci.sh      | 12 ++--
 3 files changed, 13 insertions(+), 13 deletions(-)

diff --git a/.github/workflows/run-unit-tests.yml b/.github/workflows/run-unit-tests.yml
index 4f72ff3c01..ef7f86b481 100644
--- a/.github/workflows/run-unit-tests.yml
+++ b/.github/workflows/run-unit-tests.yml
@@ -126,7 +126,7 @@ jobs:
       BACKEND_VERSION: "${{ matrix.backend-version }}"
       DEBUG_RESOURCES: "${{ inputs.debug-resources }}"
       DOWNGRADE_SQLALCHEMY: "${{ inputs.downgrade-sqlalchemy }}"
-      DOWN_PENDULUM: "${{ inputs.downgrade-pendulum }}"
+      DOWNGRADE_PENDULUM: "${{ inputs.downgrade-pendulum }}"
       ENABLE_COVERAGE: "${{ inputs.run-coverage }}"
       IMAGE_TAG: "${{ inputs.image-tag }}"
       INCLUDE_SUCCESS_OUTPUTS: ${{ inputs.include-success-outputs }}

diff --git a/Dockerfile.ci b/Dockerfile.ci
index 03c5bbb4e7..3c0e0f2932 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -1049,11 +1049,11 @@ function check_pydantic() {
 }

-function check_download_sqlalchemy() {
+function check_downgrade_sqlalchemy() {
     if [[ ${DOWNGRADE_SQLALCHEMY=} != "true" ]]; then
         return
     fi
-    min_sqlalchemy_version=$(grep "\"sqlalchemy>=" pyproject.toml | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
+    min_sqlalchemy_version=$(grep "\"sqlalchemy>=" hatch_build.py | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
     echo
     echo "${COLOR_BLUE}Downgrading sqlalchemy to minimum supported version: ${min_sqlalchemy_version}${COLOR_RESET}"
     echo
@@ -1062,11 +1062,11 @@ function check_download_sqlalchemy() {
     pip check
 }

-function check_download_pendulum() {
+function check_downgrade_pendulum() {
     if [[ ${DOWNGRADE_PENDULUM=} != "true" ]]; then
         return
     fi
-    min_pendulum_version=$(grep "\"pendulum>=" pyproject.toml | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
+    min_pendulum_version=$(grep "\"pendulum>=" hatch_build.py | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
     echo
     echo "${COLOR_BLUE}Downgrading pendulum to minimum supported version: ${min_pendulum_version}${COLOR_RESET}"
     echo
@@ -1104,8 +1104,8 @@ determine_airflow_to_use
 environment_initialization
 check_boto_upgrade
 check_pydantic
-check_download_sqlalchemy
-check_download_pendulum
+check_downgrade_sqlalchemy
+check_downgrade_pendulum
 check_run_tests "${@}"
 exec /bin/bash "${@}"

diff --git a/scripts/docker/entrypoint_ci.sh b/scripts/docker/entrypoint_ci.sh
index 2792035d7f..8b2ec5c09f 100755
--- a/scripts/docker/entrypoint_ci.sh
+++ b/scripts/docker/entrypoint_ci.sh
@@ -271,11 +271,11 @@ function check_pydantic() {

 # Download minimum supported version of sqlalchemy to run tests with it
-function check_download_sqlalchemy() {
+function check_downgrade_sqlalchemy() {
     if [[ ${DOWNGRADE_SQLALCHEMY=} != "true" ]]; then
         return
     fi
-    min_sqlalchemy_version=$(grep "\"sqlalchemy>=" pyproject.toml | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
+    min_sqlalchemy_version=$(grep "\"sqlalchemy>=" hatch_build.py | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
     echo
     echo "${COLOR_BLUE}Downgrading sqlalchemy to minimum supported version: ${min_sqlalchemy_version}${COLOR_RESET}"
     echo
@@ -285,11 +285,11 @@ function check_download_sqlalchemy() {
 }

 # Download minimum supported version of pendulum to run tests with it
-function check_download_pendulum() {
+function check_downgrade_pendulum() {
     if [[ ${DOWNGRADE_PENDULUM=} != "true" ]]; then
         return
     fi
-    min_pendulum_version=$(grep "\"pendulum>=" pyproject.toml | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
+    min_pendulum_version=$(grep "\"pendulum>=" hatch_build.py | sed "s/.*>=\([0-9\.]*\).*/\1/" | xargs)
     echo
     echo "${COLOR_BLUE}Downgrading pendulum to minimum supported version: ${min_pendulum_version}${COLOR_RESET}"
     echo
@@ -328,8 +328,8 @@ determine_airflow_to_use
 environment_initialization
 check_boto_upgrade
 check_pydantic
-check_download_sqlalchemy
-check_download_pendulum
+check_downgrade_sqlalchemy
+check_downgrade_pendulum
 check_run_tests "${@}"

 # If we are not running tests - just exec to bash shell
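The grep/sed pipeline in the commit above can be mirrored in Python. A minimal sketch, assuming a pin line shaped like the ones in `hatch_build.py`; the `min_pinned_version` helper name and the version shown are illustrative:

```python
import re


def min_pinned_version(spec_text: str, package: str) -> str:
    """Mimic: grep '"<package>>=' <file> | sed "s/.*>=\\([0-9.]*\\).*/\\1/"."""
    for line in spec_text.splitlines():
        # grep step: find the line with the quoted pin
        if f'"{package}>=' in line:
            # sed step: extract the version number after ">="
            match = re.search(r">=([0-9.]+)", line)
            if match:
                return match.group(1)
    raise ValueError(f"no minimum pin found for {package}")


print(min_pinned_version('    "sqlalchemy>=1.4.36",', "sqlalchemy"))
# 1.4.36
```

Reading the pin from a single well-known file is the whole point of the fix: the pipeline silently returns nothing when it greps the wrong file, which is how the test broke after the move from pyproject.toml.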
Re: [PR] Fix failing MinSQLAlchemy test after moving sqlalchemy spec [airflow]
potiuk commented on code in PR #38444: URL: https://github.com/apache/airflow/pull/38444#discussion_r1536962905 ## .github/workflows/run-unit-tests.yml: ## @@ -126,7 +126,7 @@ jobs: BACKEND_VERSION: "${{ matrix.backend-version }}" DEBUG_RESOURCES: "${{ inputs.debug-resources }}" DOWNGRADE_SQLALCHEMY: "${{ inputs.downgrade-sqlalchemy }}" - DOWN_PENDULUM: "${{ inputs.downgrade-pendulum }}" + DOWNGRADE_PENDULUM: "${{ inputs.downgrade-pendulum }}" Review Comment: also noticed a typo here :)
[PR] Fix failing MinSQLAlchemy test after moving sqlalchemy spec [airflow]
potiuk opened a new pull request, #38444: URL: https://github.com/apache/airflow/pull/38444 The MinSQLAlchemy test was based on retrieving the sqlalchemy min version from pyproject.toml, but since we moved it to hatch_build.py we should read it from there.
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536951037 ## .github/workflows/finalize-tests.yml: ## @@ -120,31 +120,48 @@ jobs: run: git push - # Push BuildX cache to GitHub Registry in Apache repository, if all tests are successful and build - # is executed as result of direct push to "main" or one of the "vX-Y-test" branches - # It rebuilds all images using just-pushed constraints using buildx and pushes them to registry - # It will automatically check if a new python image was released and will pull the latest one if needed - # push-buildx-cache-to-github-registry: - #name: Push Regular Image Cache - #needs: [update-constraints] - #uses: ./.github/workflows/push-image-cache.yml - #permissions: - # contents: read - # packages: write - #secrets: inherit - #with: - # runs-on: ${{ inputs.runs-on }} - # cache-type: "Regular" - # include-prod-images: "true" - # push-latest-images: "true" - # use-uv: "true" - # image-tag: ${{ inputs.image-tag }} - # python-versions: ${{ inputs.python-versions }} - # branch: ${{ inputs.branch }} - # constraints-branch: ${{ inputs.constraints-branch }} - # include-success-outputs: ${{ inputs.include-success-outputs }} - # docker-cache: ${{ inputs.docker-cache }} - #if: inputs.canary-run == 'true' + push-buildx-cache-to-github-registry-amd: Review Comment: We now have the "final" cache built for both AMD and ARM, but in two separate workflows. This way AMD can be run using public runners, but ARM will use self-hosted ones.
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536950612

## .github/workflows/additional-ci-image-checks.yml: ##
@@ -89,42 +89,40 @@ jobs:
   # delay cache refresh. It does not attempt to upgrade to newer dependencies.
   # We only push CI cache as PROD cache usually does not gain as much from fresh cache because
   # it uses prepared airflow and provider packages that invalidate the cache anyway most of the time
-  # push-early-buildx-cache-to-github-registry:
-  #   name: Push Early Image Cache
-  #   uses: ./.github/workflows/push-image-cache.yml
-  #   permissions:
-  #     contents: read
-  #     # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs
-  #     # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo.
-  #     # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the
-  #     # "in-workflow-build" condition
-  #     packages: write
-  #   secrets: inherit
-  #   with:
-  #     runs-on: ${{ inputs.runs-on }}
-  #     cache-type: "Early"
-  #     include-prod-images: "false"
-  #     push-latest-images: "false"
-  #     image-tag: ${{ inputs.image-tag }}
-  #     python-versions: ${{ inputs.python-versions }}
-  #     branch: ${{ inputs.branch }}
-  #     use-uv: "true"
-  #     include-success-outputs: ${{ inputs.include-success-outputs }}
-  #     constraints-branch: ${{ inputs.constraints-branch }}
-  #     docker-cache: ${{ inputs.docker-cache }}
-  #   if: inputs.canary-run == 'true' && inputs.branch == 'main'
+  push-early-buildx-cache-to-github-registry:
+    name: Push Early Image Cache
+    uses: ./.github/workflows/push-image-cache.yml
+    permissions:
+      contents: read
+      # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs
+      # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo.
+      # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the
+      # "in-workflow-build" condition
+      packages: write
+    secrets: inherit
+    with:
+      cache-type: "Early"
+      include-prod-images: "false"
+      push-latest-images: "false"
+      platform: "linux/amd64"
+      python-versions: ${{ inputs.python-versions }}
+      branch: ${{ inputs.branch }}
+      constraints-branch: ${{ inputs.constraints-branch }}
+      use-uv: "true"
+      include-success-outputs: ${{ inputs.include-success-outputs }}
+      docker-cache: ${{ inputs.docker-cache }}
+    if: inputs.canary-run == 'true' && inputs.branch == 'main'

   # Check that after earlier cache push, breeze command will build quickly
   check-that-image-builds-quickly:
-    timeout-minutes: 5
+    timeout-minutes: 11

Review Comment: This job had not been testing what it was supposed to be testing. Having IMAGE_TAG, it was just pulling the image rather than building it using the cache, and the timeout of 2 minutes was far too low for building it. We might tweak it in the future as well when the cache gets stabilized.
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536950198

## .github/workflows/additional-ci-image-checks.yml: ##
@@ -89,42 +89,40 @@ jobs:
   # delay cache refresh. It does not attempt to upgrade to newer dependencies.
   # We only push CI cache as PROD cache usually does not gain as much from fresh cache because
   # it uses prepared airflow and provider packages that invalidate the cache anyway most of the time
-  # push-early-buildx-cache-to-github-registry:
-  #   name: Push Early Image Cache
-  #   uses: ./.github/workflows/push-image-cache.yml
-  #   permissions:
-  #     contents: read
-  #     # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs
-  #     # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo.
-  #     # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the
-  #     # "in-workflow-build" condition
-  #     packages: write
-  #   secrets: inherit
-  #   with:
-  #     runs-on: ${{ inputs.runs-on }}
-  #     cache-type: "Early"
-  #     include-prod-images: "false"
-  #     push-latest-images: "false"
-  #     image-tag: ${{ inputs.image-tag }}
-  #     python-versions: ${{ inputs.python-versions }}
-  #     branch: ${{ inputs.branch }}
-  #     use-uv: "true"
-  #     include-success-outputs: ${{ inputs.include-success-outputs }}
-  #     constraints-branch: ${{ inputs.constraints-branch }}
-  #     docker-cache: ${{ inputs.docker-cache }}
-  #   if: inputs.canary-run == 'true' && inputs.branch == 'main'
+  push-early-buildx-cache-to-github-registry:
+    name: Push Early Image Cache
+    uses: ./.github/workflows/push-image-cache.yml
+    permissions:
+      contents: read
+      # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs
+      # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo.
+      # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the
+      # "in-workflow-build" condition
+      packages: write
+    secrets: inherit
+    with:
+      # Runs on Public runners

Review Comment: We build the AMD images using public runners, to save "self-hosted" ones for heavier jobs.
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536950005 ## .github/workflows/additional-ci-image-checks.yml: ## @@ -89,42 +89,40 @@ jobs: # delay cache refresh. It does not attempt to upgrade to newer dependencies. # We only push CI cache as PROD cache usually does not gain as much from fresh cache because # it uses prepared airflow and provider packages that invalidate the cache anyway most of the time - # push-early-buildx-cache-to-github-registry: - # name: Push Early Image Cache - # uses: ./.github/workflows/push-image-cache.yml - # permissions: - # contents: read - # # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs - # # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. - # # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # # "in-workflow-build" condition - # packages: write - # secrets: inherit - # with: - # runs-on: ${{ inputs.runs-on }} - # cache-type: "Early" - # include-prod-images: "false" - # push-latest-images: "false" - # image-tag: ${{ inputs.image-tag }} - # python-versions: ${{ inputs.python-versions }} - # branch: ${{ inputs.branch }} - # use-uv: "true" - # include-success-outputs: ${{ inputs.include-success-outputs }} - # constraints-branch: ${{ inputs.constraints-branch }} - # docker-cache: ${{ inputs.docker-cache }} - # if: inputs.canary-run == 'true' && inputs.branch == 'main' + push-early-buildx-cache-to-github-registry: Review Comment: Re-enable pushing early image cache.
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536949904 ## dev/breeze/src/airflow_breeze/utils/docker_command_utils.py: ## @@ -381,12 +381,15 @@ def prepare_base_build_command(image_params: CommonBuildParams) -> list[str]: ] ) if not image_params.docker_host: +builder = get_and_use_docker_context(image_params.builder) build_command_param.extend( [ "--builder", -get_and_use_docker_context(image_params.builder), +builder, ] ) +if builder != "default": Review Comment: When we are using a builder different than the default, we use the `--load` flag to load the built image into the local Docker engine.
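The conditional `--builder`/`--load` logic described in the review comment can be sketched as a small pure function. Note this is a hypothetical helper for illustration, not the actual breeze code; `docker_host` and `builder` mirror the corresponding `image_params` fields:

```python
def base_build_args(docker_host, builder):
    """Sketch of the --builder/--load selection described in the review comment."""
    args = []
    if not docker_host:
        args.extend(["--builder", builder])
        # A non-default buildx builder keeps the result inside the builder's
        # own store, so --load is needed to make the image visible to the
        # local Docker engine (i.e. to `docker images`).
        if builder != "default":
            args.append("--load")
    return args
```

When a remote `docker_host` is set, buildx handles the image transfer differently, so neither flag is appended in this sketch.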
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536949488 ## dev/breeze/src/airflow_breeze/utils/docker_command_utils.py: ## @@ -658,6 +661,7 @@ def autodetect_docker_context(): def get_and_use_docker_context(context: str): if context == "autodetect": context = autodetect_docker_context() +run_command(["docker", "context", "create", context], check=False) Review Comment: Creates missing context if needed.
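The idempotent create-if-missing pattern here relies on `check=False`: `docker context create` exits non-zero when the context already exists, and that failure is deliberately ignored. A sketch with an injected command runner (so it is testable without Docker; the real code calls breeze's `run_command` helper directly):

```python
def ensure_docker_context(context, run_command):
    """Hypothetical sketch of the create-if-missing step added in the diff above.

    `run_command` is injected for testability; in breeze it is the shared
    command-running helper.
    """
    # check=False: if the context already exists, the create command fails
    # harmlessly and we proceed to use the existing context.
    run_command(["docker", "context", "create", context], check=False)
    return context
```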
Re: [PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk commented on code in PR #38442: URL: https://github.com/apache/airflow/pull/38442#discussion_r1536949451 ## dev/breeze/src/airflow_breeze/utils/image.py: ## @@ -197,18 +197,8 @@ def tag_image_as_latest(image_params: CommonBuildParams, output: Output | None) check=False, ) if command.returncode != 0: -return command -if image_params.push: -command = run_command( -[ -"docker", -"push", -image_params.airflow_image_name + ":latest", -], -output=output, -capture_output=True, -check=False, -) +get_console(output=output).print(command.stdout) Review Comment: Adds diagnostics in case `docker tag` fails
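The diagnostic change (printing the captured stdout when `docker tag` fails instead of returning silently) can be sketched as follows. The helper name is hypothetical, and `console_print` stands in for breeze's `get_console(output=output).print`:

```python
from subprocess import CompletedProcess

def report_failed_tag(result, console_print):
    """Sketch of the failure-diagnostics path described in the review comment."""
    if result.returncode != 0:
        # Surface the captured output so a failed `docker tag` is debuggable
        # from the CI logs rather than failing without explanation.
        console_print(result.stdout)
    return result
```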
(airflow) branch fix-image-cache updated (64494f3c09 -> 579cd39821)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git discard 64494f3c09 Fix image cache optimizations - speeding up the build add 579cd39821 Fix image cache optimizations - speeding up the build This update added new revisions after undoing existing revisions. That is to say, some revisions that were in the old version of the branch are not in the new version. This situation occurs when a user --force pushes a change and generates a repository containing something like this: * -- * -- B -- O -- O -- O (64494f3c09) \ N -- N -- N refs/heads/fix-image-cache (579cd39821) You should already have received notification emails for all of the O revisions, and so the following emails describe only the N revisions from the common base, B. Any revisions marked "omit" are not gone; other references still refer to them. Any revisions marked "discard" are gone forever. No new revisions were added by this update. Summary of changes: .github/workflows/additional-ci-image-checks.yml | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-)
Re: [PR] Helm: Add namespace to all namespace-scoped resources [airflow]
github-actions[bot] commented on PR #33177: URL: https://github.com/apache/airflow/pull/33177#issuecomment-2017004416 This pull request has been automatically marked as stale because it has not had recent activity. It will be closed in 5 days if no further activity occurs. Thank you for your contributions.
(airflow) branch fix-image-cache updated (81467fa5e8 -> 64494f3c09)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git omit 81467fa5e8 Fix image cache optimizations - speeding up the build add 64494f3c09 Fix image cache optimizations - speeding up the build Summary of changes: .github/workflows/additional-ci-image-checks.yml | 1 - .github/workflows/finalize-tests.yml | 2 -- .github/workflows/push-image-cache.yml | 10 ++ dev/breeze/src/airflow_breeze/utils/image.py | 12 4 files changed, 2 insertions(+), 23 deletions(-)
(airflow) branch fix-image-cache updated (35763472bf -> 81467fa5e8)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git discard 35763472bf Fix image cache optimizations - speeding up the build add 81467fa5e8 Fix image cache optimizations - speeding up the build Summary of changes: dev/breeze/src/airflow_breeze/utils/docker_command_utils.py | 5 - 1 file changed, 4 insertions(+), 1 deletion(-)
(airflow) branch fix-image-cache updated (3eb5212b71 -> 35763472bf)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git discard 3eb5212b71 Fix image cache optimizations - speeding up the build add 35763472bf Fix image cache optimizations - speeding up the build Summary of changes: dev/breeze/src/airflow_breeze/utils/image.py | 2 ++ 1 file changed, 2 insertions(+)
(airflow) branch fix-image-cache updated (44a256052d -> 3eb5212b71)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git discard 44a256052d Fix image cache optimizations - speeding up the build add 20cb9f1770 Upgrade to newer build dependencies including the right Python version (#38443) add 3eb5212b71 Fix image cache optimizations - speeding up the build Summary of changes: .github/workflows/basic-tests.yml | 8 +++- .../workflows/static-checks-mypy-and-constraints-generation.yml | 6 ++ pyproject.toml| 1 + scripts/ci/pre_commit/pre_commit_update_build_dependencies.py | 7 ++- 4 files changed, 20 insertions(+), 2 deletions(-)
(airflow) branch main updated: Upgrade to newer build dependencies including the right Python version (#38443)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new 20cb9f1770 Upgrade to newer build dependencies including the right Python version (#38443) 20cb9f1770 is described below commit 20cb9f1770c819f31e48748fc9fb86527f739d6e Author: Jarek Potiuk AuthorDate: Sun Mar 24 23:50:59 2024 +0100 Upgrade to newer build dependencies including the right Python version (#38443) This change upgrades build dependencies, accounting for the fact that for earlier Python versions the set of dependencies might be different than for later ones. This PR assumes that the default Python version (3.8) is installed in the environment and on the path, and it will refuse to run the pre-commit if it is not. --- .github/workflows/basic-tests.yml | 8 +++- .../workflows/static-checks-mypy-and-constraints-generation.yml | 6 ++ pyproject.toml| 1 + scripts/ci/pre_commit/pre_commit_update_build_dependencies.py | 7 ++- 4 files changed, 20 insertions(+), 2 deletions(-) diff --git a/.github/workflows/basic-tests.yml b/.github/workflows/basic-tests.yml index faf0d3963d..de70535a8d 100644 --- a/.github/workflows/basic-tests.yml +++ b/.github/workflows/basic-tests.yml @@ -207,12 +207,18 @@ jobs: persist-credentials: false - name: Cleanup docker uses: ./.github/actions/cleanup-docker + - name: "Setup python" +uses: actions/setup-python@v5 +with: + python-version: ${{ inputs.default-python-version }} + cache: 'pip' + cache-dependency-path: ./dev/breeze/pyproject.toml - name: "Setup python" uses: actions/setup-python@v5 with: python-version: "${{ inputs.default-python-version }}" cache: 'pip' - cache-dependency-path: ./dev/breeze/setup* + cache-dependency-path: ./dev/breeze/pyproject.toml - name: "Install Breeze" uses: ./.github/actions/breeze id: breeze diff --git a/.github/workflows/static-checks-mypy-and-constraints-generation.yml
b/.github/workflows/static-checks-mypy-and-constraints-generation.yml index d083183516..3130c23a25 100644 --- a/.github/workflows/static-checks-mypy-and-constraints-generation.yml +++ b/.github/workflows/static-checks-mypy-and-constraints-generation.yml @@ -162,6 +162,12 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false + - name: "Setup python" +uses: actions/setup-python@v5 +with: + python-version: ${{ inputs.default-python-version }} + cache: 'pip' + cache-dependency-path: ./dev/breeze/pyproject.toml - name: Cleanup docker uses: ./.github/actions/cleanup-docker - name: "Prepare breeze & CI image: ${{ inputs.default-python-version}}:${{ inputs.image-tag }}" diff --git a/pyproject.toml b/pyproject.toml index 55f9592ecc..77b7f9e2ae 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -29,6 +29,7 @@ requires = [ "pathspec==0.12.1", "pluggy==1.4.0", "smmap==5.0.1", +"tomli==2.0.1; python_version < '3.11'", "trove-classifiers==2024.3.3", ] build-backend = "hatchling.build" diff --git a/scripts/ci/pre_commit/pre_commit_update_build_dependencies.py b/scripts/ci/pre_commit/pre_commit_update_build_dependencies.py index af0916bbd3..e32a8b2bb5 100755 --- a/scripts/ci/pre_commit/pre_commit_update_build_dependencies.py +++ b/scripts/ci/pre_commit/pre_commit_update_build_dependencies.py @@ -37,11 +37,16 @@ FILES_TO_REPLACE_HATCHLING_IN = [ files_changed = False + if __name__ == "__main__": +python38_bin = shutil.which("python3.8") +if not python38_bin: +print("Python 3.8 is required to run this script.") +sys.exit(1) temp_dir = Path(tempfile.mkdtemp()) hatchling_spec = "" try: -subprocess.check_call([sys.executable, "-m", "venv", temp_dir.as_posix()]) +subprocess.check_call([python38_bin, "-m", "venv", temp_dir.as_posix()]) venv_python = temp_dir / "bin" / "python" subprocess.check_call([venv_python, "-m", "pip", "install", "gitpython", "hatchling"]) frozen_deps = subprocess.check_output([venv_python, "-m", "pip", "freeze"], text=True)
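The fail-fast guard added to `pre_commit_update_build_dependencies.py` (checking for the pinned interpreter with `shutil.which` before creating the venv) generalizes to a tiny helper. The helper name is hypothetical; the script above inlines the same check for `python3.8`:

```python
import shutil


def require_interpreter(name):
    """Sketch of the guard in the commit above: abort early when the
    pinned interpreter is not on PATH, instead of failing later in the
    venv-creation step."""
    path = shutil.which(name)
    if path is None:
        raise SystemExit(f"{name} is required to run this script.")
    return path
```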
Re: [PR] Upgrade to newer build dependencies including the right Python version [airflow]
potiuk merged PR #38443: URL: https://github.com/apache/airflow/pull/38443
Re: [PR] Add legend for duration markline. [airflow]
simond commented on PR #38434: URL: https://github.com/apache/airflow/pull/38434#issuecomment-2016947663 Just looking at your screenshots: you may want to change the wording to 'median total', 'median run', etc. rather than 'mean', since median and mean are different things. It looks like you're using the correct one (the median) in your calculations, though.
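The reviewer's distinction matters because run durations are typically skewed: a single slow rerun moves the mean far more than the median, so the legend label should match the statistic actually plotted. A quick illustration with hypothetical durations (values chosen to be exactly representable as floats):

```python
from statistics import mean, median

# Hypothetical task durations in hours; the 23.0 h entry is an outlier rerun.
durations = [2.5, 3.0, 3.5, 23.0]

print("mean:", mean(durations))      # mean: 8.0 (the outlier dominates)
print("median:", median(durations))  # median: 3.25 (robust to the outlier)
```

Labeling the 3.25 markline "mean" would misstate the statistic by a factor of more than two here, which is exactly the confusion the comment is flagging.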
[PR] Upgrade to newer build dependencies including the right Python version [airflow]
potiuk opened a new pull request, #38443: URL: https://github.com/apache/airflow/pull/38443 This change upgrades build dependencies, accounting for the fact that for earlier Python versions the set of dependencies might be different than for later ones. This PR assumes that the default Python version (3.8) is installed in the environment and on the path, and it will refuse to run the pre-commit if it is not. --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
Re: [PR] Test cache refreshing in CI [JUST TESTING] [airflow]
potiuk closed pull request #38427: Test cache refreshing in CI [JUST TESTING] URL: https://github.com/apache/airflow/pull/38427
(airflow) branch fix-image-cache updated (fe69e7423a -> 44a256052d)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git discard fe69e7423a Fix image cache optimizations - speeding up the build add 25e5d54192 Print selective-check traceback on stdout rather than stderr (#38441) add c88370305d Turn optional-dependencies in pyproject.toml into dynamic property (#38437) add 44a256052d Fix image cache optimizations - speeding up the build
Summary of changes: .dockerignore |1 - .pre-commit-config.yaml| 18 +- Dockerfile | 11 +- Dockerfile.ci | 12 +- INSTALL| 140 ++- airflow_pre_installed_providers.txt|2 +- clients/python/pyproject.toml |2 +- contributing-docs/07_local_virtualenv.rst | 51 +- contributing-docs/08_static_code_checks.rst|4 +- .../12_airflow_dependencies_and_extras.rst | 87 +- dev/breeze/README.md |2 +- dev/breeze/doc/02_customizing.rst |2 +- dev/breeze/doc/ci/04_selective_checks.md | 139 +-- dev/breeze/doc/images/output_static-checks.svg | 140 +-- dev/breeze/doc/images/output_static-checks.txt |2 +- dev/breeze/pyproject.toml |2 +- .../src/airflow_breeze/commands/ci_commands.py | 44 +- dev/breeze/src/airflow_breeze/pre_commit_ids.py|2 +- dev/breeze/src/airflow_breeze/utils/packages.py|3 + .../src/airflow_breeze/utils/selective_checks.py | 54 +- dev/breeze/tests/test_selective_checks.py | 49 +- docker_tests/requirements.txt |1 + docker_tests/test_prod_image.py| 10 +- docs/apache-airflow/extra-packages-ref.rst |2 +- .../installation/installing-from-sources.rst |6 + hatch_build.py | 716 +++- pyproject.toml | 1217 ++-- scripts/ci/pre_commit/common_precommit_utils.py| 27 +- .../pre_commit_check_extra_packages_ref.py | 19 +- ...ls.py => pre_commit_check_order_hatch_build.py} | 42 +- .../pre_commit_check_order_pyproject_toml.py | 104 -- scripts/ci/pre_commit/pre_commit_insert_extras.py | 86 +- .../pre_commit_sort_installed_providers.py |2 - .../pre_commit_update_build_dependencies.py| 29 +- .../pre_commit_update_providers_dependencies.py| 164 --- ...install_airflow_dependencies_from_branch_tip.sh | 11 +- 36 files changed, 1327 insertions(+), 1876 deletions(-) copy scripts/ci/pre_commit/{common_precommit_black_utils.py => pre_commit_check_order_hatch_build.py} (54%) mode change 100644 => 100755 delete mode 100755 scripts/ci/pre_commit/pre_commit_check_order_pyproject_toml.py
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk merged PR #38437: URL: https://github.com/apache/airflow/pull/38437
(airflow) branch fix-image-cache updated (89a3c70b96 -> fe69e7423a)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git discard 89a3c70b96 Fix image cache optimizations - speeding up the build add fe69e7423a Fix image cache optimizations - speeding up the build Summary of changes: .github/workflows/build-images.yml | 4 ++-- .github/workflows/ci.yml | 2 ++ 2 files changed, 4 insertions(+), 2 deletions(-)
Re: [I] Wrong DAG duration time [airflow]
arley-wilches commented on issue #30877: URL: https://github.com/apache/airflow/issues/30877#issuecomment-2016938365 Hi there. I found a similar issue: the duration time does not take mapped tasks into account. I have two tasks; one of them took about 3 hours and the other took a similar time, but I reran a couple of its tasks, and the bar is incorrect. Screenshots: https://github.com/apache/airflow/assets/67939534/669eb5ae-84d6-4f4e-bcdd-90e6b6ebc39c https://github.com/apache/airflow/assets/67939534/2a75ef6f-83e1-4e30-a40f-16ea1c60bbe3 https://github.com/apache/airflow/assets/67939534/ce07ed00-e8a4-443e-b80b-ec428153ffe2 The other issue: the total time was about 3 hours, but the duration showed 23 hours; when I reran the task, the total time was again about 3 hours but it showed 12 min. I am using: Version: [v2.8.1](https://pypi.python.org/pypi/apache-airflow/2.8.1) Git Version: .release:c0ffa9c5d96625c68ded9562632674ed366b5eb3
[PR] Fix image cache optimizations - speeding up the build [airflow]
potiuk opened a new pull request, #38442: URL: https://github.com/apache/airflow/pull/38442 The recent refactors in workflows broke the way the cache was used in the CI builds. This PR brings back the optimizations by using the cache and rebuilding it.
(airflow) branch fix-image-cache created (now 89a3c70b96)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git at 89a3c70b96 Fix image cache optimizations - speeding up the build This branch includes the following new commits: new 89a3c70b96 Fix image cache optimizations - speeding up the build The 1 revisions listed above as "new" are entirely new to this repository and will be described in separate emails. The revisions listed as "add" were already present in the repository and have only been added to this reference.
(airflow) 01/01: Fix image cache optimizations - speeding up the build
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a commit to branch fix-image-cache in repository https://gitbox.apache.org/repos/asf/airflow.git commit 89a3c70b9654c7f8812e87d49bc4dfec6979486d Author: Jarek Potiuk AuthorDate: Sun Mar 24 21:21:03 2024 +0100 Fix image cache optimizations - speeding up the build The recent refactors in workflows broke the way how cache had been used in the CI builds. This PR brings back the optimizations by using the cache and rebuilding it. --- .github/workflows/additional-ci-image-checks.yml | 54 - .github/workflows/build-images.yml | 3 +- .github/workflows/ci-image-build.yml | 17 +++--- .github/workflows/finalize-tests.yml | 69 ++ .github/workflows/prod-image-build.yml | 9 ++- .github/workflows/prod-image-extra-checks.yml | 3 + .github/workflows/push-image-cache.yml | 62 ++- .../airflow_breeze/utils/docker_command_utils.py | 1 + 8 files changed, 122 insertions(+), 96 deletions(-) diff --git a/.github/workflows/additional-ci-image-checks.yml b/.github/workflows/additional-ci-image-checks.yml index 1bee163b0f..8cfc9acefc 100644 --- a/.github/workflows/additional-ci-image-checks.yml +++ b/.github/workflows/additional-ci-image-checks.yml @@ -89,30 +89,31 @@ jobs: # delay cache refresh. It does not attempt to upgrade to newer dependencies. # We only push CI cache as PROD cache usually does not gain as much from fresh cache because # it uses prepared airflow and provider packages that invalidate the cache anyway most of the time - # push-early-buildx-cache-to-github-registry: - # name: Push Early Image Cache - # uses: ./.github/workflows/push-image-cache.yml - # permissions: - # contents: read - # # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs - # # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
- # # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # # "in-workflow-build" condition - # packages: write - # secrets: inherit - # with: - # runs-on: ${{ inputs.runs-on }} - # cache-type: "Early" - # include-prod-images: "false" - # push-latest-images: "false" - # image-tag: ${{ inputs.image-tag }} - # python-versions: ${{ inputs.python-versions }} - # branch: ${{ inputs.branch }} - # use-uv: "true" - # include-success-outputs: ${{ inputs.include-success-outputs }} - # constraints-branch: ${{ inputs.constraints-branch }} - # docker-cache: ${{ inputs.docker-cache }} - # if: inputs.canary-run == 'true' && inputs.branch == 'main' + push-early-buildx-cache-to-github-registry: +name: Push Early Image Cache +uses: ./.github/workflows/push-image-cache.yml +permissions: + contents: read + # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs + # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
+ # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the + # "in-workflow-build" condition + packages: write +secrets: inherit +with: + # Runs on Public runners + cache-type: "Early" + include-prod-images: "false" + push-latest-images: "false" + image-tag: ${{ inputs.image-tag }} + platform: "linux/amd64" + python-versions: ${{ inputs.python-versions }} + branch: ${{ inputs.branch }} + constraints-branch: ${{ inputs.constraints-branch }} + use-uv: "true" + include-success-outputs: ${{ inputs.include-success-outputs }} + docker-cache: ${{ inputs.docker-cache }} +if: inputs.canary-run == 'true' && inputs.branch == 'main' # Check that after earlier cache push, breeze command will build quickly check-that-image-builds-quickly: @@ -121,7 +122,6 @@ jobs: runs-on: ["ubuntu-22.04"] env: UPGRADE_TO_NEWER_DEPENDENCIES: false - PLATFORM: "linux/amd64" PYTHON_MAJOR_MINOR_VERSION: ${{ inputs.default-python-version }} PYTHON_VERSION: ${{ inputs.default-python-version }} IMAGE_TAG: ${{ inputs.image-tag }} @@ -142,7 +142,7 @@ jobs: - name: "Login to ghcr.io" run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - name: "Check that image builds quickly" -run: breeze shell --max-time 120 +run: breeze shell --max-time 120 --platform "linux/amd64" # This is only a check if ARM images are successfully building when committer runs PR from # Apache repository. This is needed in case you want to fix failing cache job in "canary" run @@ -156,11 +156,11 @@ jobs: packages:
(airflow) branch test-cache-refreshing updated (90d68148c3 -> 375f9426a3)
This is an automated email from the ASF dual-hosted git repository. potiuk pushed a change to branch test-cache-refreshing in repository https://gitbox.apache.org/repos/asf/airflow.git discard 90d68148c3 Test cache refreshing in CI add 375f9426a3 Test cache refreshing in CI Summary of changes: .github/workflows/ci-image-build.yml| 2 +- .github/workflows/prod-image-build.yml | 1 + dev/breeze/src/airflow_breeze/utils/docker_command_utils.py | 1 + 3 files changed, 3 insertions(+), 1 deletion(-)
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437: URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016913907 Ah. Now with the traceback it's clear. The old breeze expects a "dependencies" property in `project` ... That's fine. This will be fixed once the PR is merged.
Re: [PR] Apply task instance mutation hook consistently [airflow]
potiuk commented on PR #38440: URL: https://github.com/apache/airflow/pull/38440#issuecomment-2016912706 NIT: Maybe a test case @jscheffl ?
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on code in PR #38437: URL: https://github.com/apache/airflow/pull/38437#discussion_r1536890083 ## INSTALL: ## @@ -128,19 +141,19 @@ You can see the list of available envs with: This is what it shows currently: -┏━┳━┳━━┳━━━┓ -┃ Name┃ Type┃ Features ┃ Description ┃ -┡━╇━╇━━╇━━━┩ -│ default │ virtual │ devel│ Default environment with Python 3.8 for maximum compatibility │ -├─┼─┼──┼───┤ -│ airflow-38 │ virtual │ │ Environment with Python 3.8. No devel installed. │ -├─┼─┼──┼───┤ -│ airflow-39 │ virtual │ │ Environment with Python 3.9. No devel installed. │ -├─┼─┼──┼───┤ -│ airflow-310 │ virtual │ │ Environment with Python 3.10. No devel installed. │ -├─┼─┼──┼───┤ -│ airflow-311 │ virtual │ │ Environment with Python 3.11. No devel installed │ -└─┴─┴──┴───┘ +┏━┳━┳━━━┓ +┃ Name┃ Type┃ Description ┃ +┡━╇━╇━━━┩ +│ default │ virtual │ Default environment with Python 3.8 for maximum compatibility │ +├─┼─┼───┤ +│ airflow-38 │ virtual │ Environment with Python 3.8. No devel installed. │ +├─┼─┼───┤ +│ airflow-39 │ virtual │ Environment with Python 3.9. No devel installed. │ +├─┼─┼───┤ +│ airflow-310 │ virtual │ Environment with Python 3.10. No devel installed. │ +├─┼─┼───┤ +│ airflow-311 │ virtual │ Environment with Python 3.11. No devel installed │ Review Comment: Missed it when we added 3.12 :)
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (02a6ee9938 -> 83135236cd)
potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata in repository https://gitbox.apache.org/repos/asf/airflow.git omit 02a6ee9938 Turn optional-dependencies in pyproject.toml into dynamic property add 83135236cd Turn optional-dependencies in pyproject.toml into dynamic property Summary of changes: INSTALL | 2 ++ 1 file changed, 2 insertions(+)
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (2b518b4e25 -> 02a6ee9938)
potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata in repository https://gitbox.apache.org/repos/asf/airflow.git omit 2b518b4e25 Turn optional-dependencies in pyproject.toml into dynamic property add 25e5d54192 Print selective-check traceback on stdout rather than stderr (#38441) add 02a6ee9938 Turn optional-dependencies in pyproject.toml into dynamic property Summary of changes: contributing-docs/07_local_virtualenv.rst | 2 +- dev/breeze/doc/02_customizing.rst | 2 +- .../src/airflow_breeze/commands/ci_commands.py | 44 -- .../installation/installing-from-sources.rst | 2 +- 4 files changed, 27 insertions(+), 23 deletions(-)
(airflow) branch main updated: Print selective-check traceback on stdout rather than stderr (#38441)
potiuk pushed a commit to branch main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/main by this push: new 25e5d54192 Print selective-check traceback on stdout rather than stderr (#38441) 25e5d54192 is described below commit 25e5d5419216900f6cdff0f985b5c2e37afeaa9b Author: Jarek Potiuk AuthorDate: Sun Mar 24 20:14:33 2024 +0100 Print selective-check traceback on stdout rather than stderr (#38441) We are using stdout of selective-check to print diagnostic information, and stderr is redirected to produce outputs for the GITHUB_OUTPUT special variable in GitHub Actions. However, this means that when there is an error when running selective-checks, the traceback goes to stderr and we cannot see it, plus it makes GitHub Actions fail with cryptic errors. This PR catches uncaught exceptions and uses the rich mechanism to print the traceback to the diagnostic (stdout) console instead, so that we can see it - in colour as well as with local variables, which might become handy.
---
 .../src/airflow_breeze/commands/ci_commands.py | 44 --
 1 file changed, 24 insertions(+), 20 deletions(-)

diff --git a/dev/breeze/src/airflow_breeze/commands/ci_commands.py b/dev/breeze/src/airflow_breeze/commands/ci_commands.py
index 12587e55d6..54f74f8026 100644
--- a/dev/breeze/src/airflow_breeze/commands/ci_commands.py
+++ b/dev/breeze/src/airflow_breeze/commands/ci_commands.py
@@ -248,26 +248,30 @@ def selective_check(
     github_actor: str,
     github_context: str,
 ):
-    from airflow_breeze.utils.selective_checks import SelectiveChecks
-
-    github_context_dict = json.loads(github_context) if github_context else {}
-    github_event = GithubEvents(github_event_name)
-    if commit_ref is not None:
-        changed_files = get_changed_files(commit_ref=commit_ref)
-    else:
-        changed_files = ()
-    sc = SelectiveChecks(
-        commit_ref=commit_ref,
-        files=changed_files,
-        default_branch=default_branch,
-        default_constraints_branch=default_constraints_branch,
-        pr_labels=tuple(ast.literal_eval(pr_labels)) if pr_labels else (),
-        github_event=github_event,
-        github_repository=github_repository,
-        github_actor=github_actor,
-        github_context_dict=github_context_dict,
-    )
-    print(str(sc), file=sys.stderr)
+    try:
+        from airflow_breeze.utils.selective_checks import SelectiveChecks
+
+        github_context_dict = json.loads(github_context) if github_context else {}
+        github_event = GithubEvents(github_event_name)
+        if commit_ref is not None:
+            changed_files = get_changed_files(commit_ref=commit_ref)
+        else:
+            changed_files = ()
+        sc = SelectiveChecks(
+            commit_ref=commit_ref,
+            files=changed_files,
+            default_branch=default_branch,
+            default_constraints_branch=default_constraints_branch,
+            pr_labels=tuple(ast.literal_eval(pr_labels)) if pr_labels else (),
+            github_event=github_event,
+            github_repository=github_repository,
+            github_actor=github_actor,
+            github_context_dict=github_context_dict,
+        )
+        print(str(sc), file=sys.stderr)
+    except Exception:
+        get_console().print_exception(show_locals=True)
+        sys.exit(1)


 TEST_BRANCH_MATCHER = re.compile(r"^v.*test$")
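The change above boils down to one rule: machine-readable outputs stay on stderr, human-readable failure details go to stdout. Stripped of Breeze's helpers, the pattern looks roughly like this (the `run_selective_check` stand-in is hypothetical, and the plain `traceback` module stands in for rich's `get_console().print_exception(show_locals=True)`):

```python
import sys
import traceback


def run_selective_check() -> str:
    # Hypothetical stand-in for building the SelectiveChecks outputs;
    # it fails here to exercise the error path.
    raise RuntimeError("boom")


def main() -> int:
    try:
        outputs = run_selective_check()
        # Machine-readable outputs: stderr, which CI redirects into ${GITHUB_OUTPUT}.
        print(outputs, file=sys.stderr)
        return 0
    except Exception:
        # Human-readable traceback: stdout, so it is visible in the job log
        # instead of being swallowed by the stderr redirection.
        traceback.print_exc(file=sys.stdout)
        return 1
```

Calling `main()` here returns 1 and prints the `RuntimeError` traceback to stdout, mirroring the `sys.exit(1)` path in the real command.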
Re: [PR] Print selective-check traceback on stdout rather than stderr [airflow]
potiuk commented on PR #38441: URL: https://github.com/apache/airflow/pull/38441#issuecomment-2016911399 > An alternative would be to use 2>&1 in the shell call to redirect all stderr to stdout? > In total meaning: breeze ci selective-check 2>&1 | tee ${GITHUB_OUTPUT} ? We already do that actually https://github.com/apache/airflow/blob/694826d1bd0a1593e676deed862519fac73266a4/.github/workflows/build-images.yml#L154 We print diagnostic information to stdout so that it can be seen in colour - for example here https://github.com/apache/airflow/actions/runs/8411260687/job/23030711777?pr=38437#step:8:849 In the past I also tried to revert it - i.e. print all the diagnostics to stderr, then print the outputs to stdout and do `> ${GITHUB_OUTPUT}` - but it was far too easy to overlook something, and it's better to redirect stderr to GITHUB_OUTPUT and leave stdout for diagnostics. Then we only have to catch all the exceptions like this and print the traceback to stdout; it's far easier than tracking down or passing "stderr" as the diagnostic output.
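The division of labour described above can be shown in a few lines: one stream for colourful diagnostics, the other for the `key=value` lines that GitHub Actions reads back from `${GITHUB_OUTPUT}`. A minimal sketch (the `emit` helper and the demo values are made up, not Breeze code):

```python
import sys


def emit(diagnostics: str, outputs: dict) -> None:
    # Job-log diagnostics: stdout (this is where rich prints in colour).
    print(diagnostics)
    # Machine-readable outputs: stderr, captured by `2> ${GITHUB_OUTPUT}`.
    # GitHub Actions parses that file as key=value lines.
    for key, value in outputs.items():
        print(f"{key}={value}", file=sys.stderr)


emit("Selected python versions: 3.8 3.9 3.10", {"run-tests": "true"})
```

With `2> ${GITHUB_OUTPUT}` applied to a script like this, only the `run-tests=true` line lands in the outputs file, while the diagnostic line stays visible in the job log.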
Re: [PR] Print selective-check traceback on stdout rather than stderr [airflow]
potiuk merged PR #38441: URL: https://github.com/apache/airflow/pull/38441
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
jscheffl commented on code in PR #38437: URL: https://github.com/apache/airflow/pull/38437#discussion_r1536887077 ## INSTALL: ## @@ -128,19 +141,19 @@ You can see the list of available envs with: This is what it shows currently: -┏━┳━┳━━┳━━━┓ -┃ Name┃ Type┃ Features ┃ Description ┃ -┡━╇━╇━━╇━━━┩ -│ default │ virtual │ devel│ Default environment with Python 3.8 for maximum compatibility │ -├─┼─┼──┼───┤ -│ airflow-38 │ virtual │ │ Environment with Python 3.8. No devel installed. │ -├─┼─┼──┼───┤ -│ airflow-39 │ virtual │ │ Environment with Python 3.9. No devel installed. │ -├─┼─┼──┼───┤ -│ airflow-310 │ virtual │ │ Environment with Python 3.10. No devel installed. │ -├─┼─┼──┼───┤ -│ airflow-311 │ virtual │ │ Environment with Python 3.11. No devel installed │ -└─┴─┴──┴───┘ +┏━┳━┳━━━┓ +┃ Name┃ Type┃ Description ┃ +┡━╇━╇━━━┩ +│ default │ virtual │ Default environment with Python 3.8 for maximum compatibility │ +├─┼─┼───┤ +│ airflow-38 │ virtual │ Environment with Python 3.8. No devel installed. │ +├─┼─┼───┤ +│ airflow-39 │ virtual │ Environment with Python 3.9. No devel installed. │ +├─┼─┼───┤ +│ airflow-310 │ virtual │ Environment with Python 3.10. No devel installed. │ +├─┼─┼───┤ +│ airflow-311 │ virtual │ Environment with Python 3.11. No devel installed │ Review Comment: airflow-312 is missing in this table if I correctly take a look? At least I can tell it is working for me :-D ```suggestion │ airflow-311 │ virtual │ Environment with Python 3.11. No devel installed │ ├─┼─┼───┤ │ airflow-312 │ virtual │ Environment with Python 3.12. No devel installed │```
(airflow) branch test-cache-refreshing updated (106bdf172b -> 90d68148c3)
potiuk pushed a change to branch test-cache-refreshing in repository https://gitbox.apache.org/repos/asf/airflow.git discard 106bdf172b Test cache refreshing in CI add 078e7ed135 Mysql change xcom value col type for MySQL backend (#38401) add 6d7563de33 Optimize number of versions run when CI scripts change (#38426) add 8520778930 Temporarily remove protection on v2-9-stable to create beta 2 (#38431) add 3ac0aaf748 Fix `parent_model` parameter in GCP Vertex AI AutoML and Custom Job operators (#38417) add 3e92409d82 bump ruff to 0.3.4 (#38433) add be72d6a258 Resolve `PT012` in `common.sql`, `datadog`, `dbt`, and `jenkins` providers tests (#38429) add c305ec7bf6 Update screenshot for extra link description as old Grid view is gone (#38435) add dd97086416 Resolve PT012 in `microsoft.azure`, `microsoft.psrp`, and `oracle` providers tests (#38436) add 0cc410b5b8 Updat build dependencies after hatchling update (#38439) add d3e9229105 Bugfix/update screenshots (#38438) add 817900 Fix deprecated `DockerOperator` operator arguments in `MappedOperator` (#38379) add d820f13e68 Use `relative-imports (TID252)` instead of `pygrep` rule (#38428) add 4a4eee113f Delete deprecated AutoML operators and deprecate AutoML hook and links (#38418) add 77d2fc7d75 Check task attribute before use in sentry.add_tagging() (#37143) add 694826d1bd refactor: Refactored __new__ magic method of BaseOperatorMeta to avoid bad mixing classic and decorated operators (#37937) add 90d68148c3 Test cache refreshing in CI
Summary of changes: .asf.yaml |8 +- .pre-commit-config.yaml| 12 +- airflow/example_dags/tutorial.py |2 +- ...2_9_0_make_xcom_value_to_longblob_for_mysql.py} | 27 +- airflow/models/base.py |1 + airflow/models/baseoperator.py | 61 +- airflow/models/mappedoperator.py |4 + airflow/models/taskinstance.py |8 +- airflow/models/xcom.py |3 +- airflow/providers/docker/operators/docker.py |7 +- airflow/providers/google/cloud/hooks/automl.py | 758 +++- airflow/providers/google/cloud/links/automl.py | 17 +- airflow/providers/google/cloud/operators/automl.py | 1255 +--- .../google/cloud/operators/vertex_ai/auto_ml.py|5 - .../google/cloud/operators/vertex_ai/custom_job.py |5 - airflow/providers/google/provider.yaml | 11 - .../microsoft/azure/serialization}/__init__.py |0 airflow/sentry.py | 196 --- .../models/base_user.py => sentry/__init__.py} | 21 +- .../{listeners/spec/dagrun.py => sentry/blank.py} | 26 +- airflow/sentry/configured.py | 176 +++ airflow/serialization/schema.json |3 +- contributing-docs/08_static_code_checks.rst|2 - dev/breeze/doc/ci/04_selective_checks.md | 153 +-- dev/breeze/doc/ci/07_debugging.md | 17 +- dev/breeze/doc/images/output_static-checks.svg | 130 +- dev/breeze/doc/images/output_static-checks.txt |2 +- dev/breeze/src/airflow_breeze/pre_commit_ids.py|1 - .../src/airflow_breeze/utils/selective_checks.py | 106 +- dev/breeze/tests/test_selective_checks.py | 84 +- docker_tests/test_prod_image.py|1 -
.../operators/cloud/automl.rst | 229 docs/apache-airflow/core-concepts/dag-run.rst |8 + docs/apache-airflow/howto/define-extra-link.rst|3 + docs/apache-airflow/img/add-dag-tags.png | Bin 68537 -> 54332 bytes docs/apache-airflow/img/airflow_erd.sha256 |2 +- docs/apache-airflow/img/airflow_erd.svg|8 +- docs/apache-airflow/img/basic-dag.png | Bin 5393 -> 12837 bytes docs/apache-airflow/img/branch_note.png| Bin 31771 -> 31259 bytes docs/apache-airflow/img/branch_with_trigger.png|
Re: [PR] Print selective-check traceback on stdout rather than stderr [airflow]
potiuk commented on PR #38441: URL: https://github.com/apache/airflow/pull/38441#issuecomment-2016903858 Should help to diagnose issues like https://github.com/apache/airflow/actions/runs/8410545233/job/23029279435
[PR] Print selective-check traceback on stdout rather than stderr [airflow]
potiuk opened a new pull request, #38441: URL: https://github.com/apache/airflow/pull/38441 We are using stdout of selective-check to print diagnostic information, and stderr is redirected to produce outputs for the GITHUB_OUTPUT special variable in GitHub Actions. However, this means that when there is an error when running selective-checks, the traceback goes to stderr and we cannot see it, plus it makes GitHub Actions fail with cryptic errors. This PR catches uncaught exceptions and uses the rich mechanism to print the traceback to the diagnostic (stdout) console instead, so that we can see it - in colour as well as with local variables, which might become handy. --- **^ Add meaningful description above** Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437: URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016894701 Updated documentation
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437: URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016894632 > If I run into this we should handle this explicitly. Is it possible to fail install if pip version is too old? No. But it ONLY affects you if you want to install airflow from sources for development. It should not affect wheel installation, because this change does not impact how `METADATA` is stored in the wheel - that happens during `hatch build`, so it really affects the "builders" of the packages and local contributors, not the "users".
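The "dynamic property" mechanism discussed above means `pyproject.toml` only declares that a field exists; the build backend computes its value when the wheel's `METADATA` is generated. A simplified sketch of how such a setup is wired with hatchling (illustrative only, not Airflow's exact configuration):

```toml
[project]
name = "apache-airflow"
# Declared dynamic: the values are computed by the build backend at
# build time rather than spelled out here.
dynamic = ["version", "optional-dependencies"]

[tool.hatch.metadata.hooks.custom]
# A custom hatchling metadata hook (hatch_build.py by default) fills in
# the dynamic fields when the wheel METADATA is generated.
path = "hatch_build.py"
```

Because the resolution happens at build time, an installed wheel already carries fully materialized `METADATA`; only source installs (and the tooling doing the build) need a pip new enough to understand dynamic metadata.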
(airflow) branch constraints-main updated: Updating constraints. Github run id:8410961843
github-bot pushed a commit to branch constraints-main in repository https://gitbox.apache.org/repos/asf/airflow.git The following commit(s) were added to refs/heads/constraints-main by this push: new 368b7511da Updating constraints. Github run id:8410961843 368b7511da is described below commit 368b7511da6f1390257a4728e7540d7cdafad81e Author: Automated GitHub Actions commit AuthorDate: Sun Mar 24 18:23:25 2024 + Updating constraints. Github run id:8410961843 This update in constraints is automatically committed by the CI 'constraints-push' step based on 'refs/heads/main' in the 'apache/airflow' repository with commit sha 694826d1bd0a1593e676deed862519fac73266a4. The action that built those constraints can be found at https://github.com/apache/airflow/actions/runs/8410961843/ The image tag used for that build was: 694826d1bd0a1593e676deed862519fac73266a4. You can enter Breeze environment with this image by running 'breeze shell --image-tag 694826d1bd0a1593e676deed862519fac73266a4' All tests passed in this build so we determined we can push the updated constraints. See https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for details.
--- constraints-3.10.txt | 6 +++--- constraints-3.11.txt | 6 +++--- constraints-3.12.txt | 6 +++--- constraints-3.8.txt | 6 +++--- constraints-3.9.txt | 6 +++--- constraints-no-providers-3.10.txt | 4 ++-- constraints-no-providers-3.11.txt | 4 ++-- constraints-no-providers-3.12.txt | 4 ++-- constraints-no-providers-3.8.txt | 4 ++-- constraints-no-providers-3.9.txt | 4 ++-- constraints-source-providers-3.10.txt | 7 +++ constraints-source-providers-3.11.txt | 7 +++ constraints-source-providers-3.12.txt | 7 +++ constraints-source-providers-3.8.txt | 7 +++ constraints-source-providers-3.9.txt | 7 +++ 15 files changed, 40 insertions(+), 45 deletions(-) diff --git a/constraints-3.10.txt b/constraints-3.10.txt index ba36cdb32f..320e31cb40 100644 --- a/constraints-3.10.txt +++ b/constraints-3.10.txt @@ -1,6 +1,6 @@ # -# This constraints file was automatically generated on 2024-03-24T10:25:00.616478 +# This constraints file was automatically generated on 2024-03-24T17:46:09.824611 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. @@ -54,14 +54,14 @@ Mako==1.3.2 Markdown==3.6 MarkupSafe==2.1.5 PyAthena==3.5.1 -PyGithub==2.2.0 +PyGithub==2.3.0 PyHive==0.7.0 PyJWT==2.8.0 PyNaCl==1.5.0 PyYAML==6.0.1 Pygments==2.17.2 SQLAlchemy-JSONField==1.0.2 -SQLAlchemy-Utils==0.41.1 +SQLAlchemy-Utils==0.41.2 SQLAlchemy==1.4.52 SecretStorage==3.3.3 Sphinx==5.3.0 diff --git a/constraints-3.11.txt b/constraints-3.11.txt index 6eb21a0f4d..0b6b173a93 100644 --- a/constraints-3.11.txt +++ b/constraints-3.11.txt @@ -1,6 +1,6 @@ # -# This constraints file was automatically generated on 2024-03-24T10:25:00.527592 +# This constraints file was automatically generated on 2024-03-24T17:46:09.844966 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. 
# This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. @@ -54,14 +54,14 @@ Mako==1.3.2 Markdown==3.6 MarkupSafe==2.1.5 PyAthena==3.5.1 -PyGithub==2.2.0 +PyGithub==2.3.0 PyHive==0.7.0 PyJWT==2.8.0 PyNaCl==1.5.0 PyYAML==6.0.1 Pygments==2.17.2 SQLAlchemy-JSONField==1.0.2 -SQLAlchemy-Utils==0.41.1 +SQLAlchemy-Utils==0.41.2 SQLAlchemy==1.4.52 SecretStorage==3.3.3 Sphinx==5.3.0 diff --git a/constraints-3.12.txt b/constraints-3.12.txt index f864c3bdcd..3e47ef433f 100644 --- a/constraints-3.12.txt +++ b/constraints-3.12.txt @@ -1,6 +1,6 @@ # -# This constraints file was automatically generated on 2024-03-24T10:25:57.681188 +# This constraints file was automatically generated on 2024-03-24T17:47:01.682815 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow. # This variant of constraints install uses the HEAD of the branch version for 'apache-airflow' but installs # the providers from PIP-released packages at the moment of the constraint generation. @@ -53,14 +53,14 @@ Mako==1.3.2 Markdown==3.6 MarkupSafe==2.1.5 PyAthena==3.5.1 -PyGithub==2.2.0 +PyGithub==2.3.0 PyHive==0.7.0 PyJWT==2.8.0 PyNaCl==1.5.0 PyYAML==6.0.1 Pygments==2.17.2 SQLAlchemy-JSONField==1.0.2 -SQLAlchemy-Utils==0.41.1 +SQLAlchemy-Utils==0.41.2 SQLAlchemy==1.4.52 SecretStorage==3.3.3 Sphinx==5.3.0 diff --git a/constraints-3.8.txt
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (902d144946 -> 2b518b4e25)
potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata in repository https://gitbox.apache.org/repos/asf/airflow.git discard 902d144946 Turn optional-dependencies in pyproject.toml into dynamic property add 2b518b4e25 Turn optional-dependencies in pyproject.toml into dynamic property Summary of changes: INSTALL | 10 ++ contributing-docs/07_local_virtualenv.rst| 10 ++ docs/apache-airflow/installation/installing-from-sources.rst | 6 ++ 3 files changed, 26 insertions(+)
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
jscheffl commented on PR #38437: URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016892056 > Actually looks like `pip 22.0` is the LAST one that does not support this dynamic retrieval of dependencies. So you picked really edge'y case @jscheffl. I will add this to the docs. That is why it is good to test :-D If I run into this we should handle this explicitly. Is it possible to fail install if pip version is too old?
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437: URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016891091 Actually looks like `pip 22.0` is the LAST one that does not support this dynamic retrieval of dependencies. So you picked a really edge'y case @jscheffl. I will add this to the docs.
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (e1139449bd -> 902d144946)
potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata in repository https://gitbox.apache.org/repos/asf/airflow.git discard e1139449bd Turn optional-dependencies in pyproject.toml into dynamic property add 902d144946 Turn optional-dependencies in pyproject.toml into dynamic property Summary of changes: dev/breeze/doc/images/output_static-checks.svg | 2 +- dev/breeze/doc/images/output_static-checks.txt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-)
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (8343dabf24 -> e1139449bd)
potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata in repository https://gitbox.apache.org/repos/asf/airflow.git discard 8343dabf24 Turn optional-dependencies in pyproject.toml into dynamic property add d3e9229105 Bugfix/update screenshots (#38438) add 817900 Fix deprecated `DockerOperator` operator arguments in `MappedOperator` (#38379) add d820f13e68 Use `relative-imports (TID252)` instead of `pygrep` rule (#38428) add 4a4eee113f Delete deprecated AutoML operators and deprecate AutoML hook and links (#38418) add 77d2fc7d75 Check task attribute before use in sentry.add_tagging() (#37143) add 694826d1bd refactor: Refactored __new__ magic method of BaseOperatorMeta to avoid bad mixing classic and decorated operators (#37937) add e1139449bd Turn optional-dependencies in pyproject.toml into dynamic property
Summary of changes: .pre-commit-config.yaml|8 - airflow/example_dags/tutorial.py |2 +- airflow/models/base.py |1 + airflow/models/baseoperator.py | 61 +- airflow/models/mappedoperator.py |4 + airflow/models/taskinstance.py |8 +- airflow/providers/docker/operators/docker.py |7 +- airflow/providers/google/cloud/hooks/automl.py | 758 +++- airflow/providers/google/cloud/links/automl.py | 17 +- airflow/providers/google/cloud/operators/automl.py | 1255 +--- airflow/providers/google/provider.yaml | 11 - .../microsoft/azure/serialization}/__init__.py |0 airflow/sentry.py | 196 --- .../models/base_user.py => sentry/__init__.py} | 21 +- .../{listeners/spec/dagrun.py => sentry/blank.py} | 26 +- airflow/sentry/configured.py | 176 +++ airflow/serialization/schema.json |3 +- contributing-docs/08_static_code_checks.rst|2 - dev/breeze/src/airflow_breeze/pre_commit_ids.py|1 - docker_tests/test_prod_image.py|1 - .../operators/cloud/automl.rst | 229 docs/apache-airflow/core-concepts/dag-run.rst |8 + docs/apache-airflow/img/add-dag-tags.png | Bin 68537 -> 54332 bytes docs/apache-airflow/img/basic-dag.png | Bin 5393 -> 12837 bytes docs/apache-airflow/img/branch_note.png| Bin 31771 -> 31259 bytes docs/apache-airflow/img/branch_with_trigger.png| Bin 34081 -> 28541 bytes docs/apache-airflow/img/branch_without_trigger.png | Bin 39815 -> 28914 bytes docs/apache-airflow/img/calendar.png | Bin 58726 -> 34689 bytes .../default_instance_name_configuration.png| Bin 154888 -> 68441 bytes .../example_instance_name_configuration.png| Bin 154915 -> 68231 bytes .../img/change-ui-colors/dags-page-new.png | Bin 483599 -> 74143 bytes .../img/change-ui-colors/dags-page-old.png | Bin 493009 -> 72288 bytes .../img/change-ui-colors/graph-view-new.png| Bin 56973 -> 34492 bytes .../img/change-ui-colors/graph-view-old.png| Bin 54884 -> 34721 bytes .../img/change-ui-colors/tree-view-new.png | Bin 36934 -> 20188 bytes .../img/change-ui-colors/tree-view-old.png | Bin 21601 -> 20146 bytes 
docs/apache-airflow/img/context.png| Bin 124467 -> 0 bytes docs/apache-airflow/img/dag_doc.png| Bin 40094 -> 93757 bytes docs/apache-airflow/img/edge_label_example.png | Bin 24592 -> 33270 bytes docs/apache-airflow/img/example_passing_conf.png | Bin 41080 -> 53383 bytes .../img/latest_only_with_trigger.png | Bin 42887 -> 59580 bytes docs/apache-airflow/img/mapping-simple-graph.png | Bin 13312 -> 7731 bytes docs/apache-airflow/img/mapping-simple-grid.png| Bin 179670 -> 93751 bytes docs/apache-airflow/img/nested_branching.png | Bin 31847 -> 0 bytes docs/apache-airflow/img/scheduler_loop.jpg | Bin 46864 -> 0 bytes
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437:
URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016887762

General advice from `pip` team is to always upgrade to newest version.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437:
URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016887543

Yes. You must upgrade your `pip`. Version 22 is from 2022, and a lot of the functionality and PEPs needed were approved and implemented later than that.
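The advice above boils down to a version check: pip 22.x predates the dynamic-metadata support this PR relies on. A minimal sketch of such a check is below; the minimum version `"23.1"` is an illustrative assumption, not the actual documented requirement, and real code should use `packaging.version.Version` rather than this naive numeric comparison.

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """Return True if an installed version string satisfies a minimum.

    Naive numeric comparison for illustration only -- production code
    should use packaging.version.Version, which also handles
    pre-release segments correctly.
    """

    def as_tuple(v: str) -> tuple[int, ...]:
        # Keep only the leading numeric components ("22.0.2" -> (22, 0, 2)).
        parts = []
        for piece in v.split("."):
            digits = "".join(ch for ch in piece if ch.isdigit())
            if not digits:
                break
            parts.append(int(digits))
        return tuple(parts)

    return as_tuple(installed) >= as_tuple(minimum)


# pip 22.0.2 (from the report below) would fail a hypothetical "23.1" floor:
print(meets_minimum("22.0.2", "23.1"))  # False
print(meets_minimum("24.0", "23.1"))    # True
```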
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
jscheffl commented on PR #38437:
URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016885982

I tried to "naively" install as described in my devenv.
1. Build local CI Image: works
2. Install via pip in local env: FAILS (see 1)

1) Details of pip install in Xubuntu 22.04 x64:
```
(.venv-breeze) jscheffl@hp860g9:~/Workspace/airflow$ python3 -m venv PATH_TO_YOUR_VENV
(.venv-breeze) jscheffl@hp860g9:~/Workspace/airflow$ source PATH_TO_YOUR_VENV/bin/activate
(PATH_TO_YOUR_VENV) (.venv-breeze) jscheffl@hp860g9:~/Workspace/airflow$ pip install -e ".[devel]"
Obtaining file:///home/jscheffl/Workspace/airflow
  Installing build dependencies ... done
  Checking if build backend supports build_editable ... done
  Getting requirements to build editable ... done
  Installing backend dependencies ... done
  Preparing editable metadata (pyproject.toml) ... done
WARNING: apache-airflow 2.9.0.dev0 does not provide the extra 'devel'
Building wheels for collected packages: apache-airflow
  Building editable for apache-airflow (pyproject.toml) ... done
  Created wheel for apache-airflow: filename=apache_airflow-2.9.0.dev0-py3-none-any.whl size=67494 sha256=a2a230edd149efea5937bcc8f9e857ce1a65b349ce365cfb3d5da2777b491a66
  Stored in directory: /tmp/pip-ephem-wheel-cache-rgbnftf3/wheels/82/15/4a/bc39417e37a8668eb5fd65e3ae91c2adaba2a1bb006639f92e
Successfully built apache-airflow
Installing collected packages: apache-airflow
Successfully installed apache-airflow-2.9.0.dev0
(PATH_TO_YOUR_VENV) (.venv-breeze) jscheffl@hp860g9:~/Workspace/airflow$ pip list
Package        Version    Editable project location
-------------- ---------- --------------------------------
apache-airflow 2.9.0.dev0 /home/jscheffl/Workspace/airflow
pip            22.0.2
setuptools     59.6.0
```
...seems besides the airflow package no other dependencies are installed.
[PR] Apply task instance mutation hook consistently [airflow]
jscheffl opened a new pull request, #38440:
URL: https://github.com/apache/airflow/pull/38440

We are using [Cluster Policies](https://airflow.apache.org/docs/apache-airflow/stable/administration-and-deployment/cluster-policies.html#task-instance-mutation), and in particular the "Task Instance Mutation" feature, to route workload to the respective endpoint. "Respective endpoint" means that we use multiple Celery queues and distribute the work. As the distribution is based on workflow metadata and we don't want to add the routing complexity into the workflow itself (modelling the workflow statically for all routing combinations), the task instance mutation is the only option.

As discussed in #32471 we have seen that the task instance mutation works in general "well" for the first execution, but we saw a couple of errors:
- When using task_instance_mutation_hook, the UI in DAGs->Grid->Task Details->More Details always shows the task definition value of queue and not the mutated value, which actually is stored in DB. Worse, when navigating in the UI, the existing queue value in DB is reset to the standard queue value without the hook applied.
- When a task fails, the retry does not apply the mutation hook and the task will go to the standard queue again.
- When using dynamic task mapping, only the first mapped task receives the queue from the mutation hook (tasks created later during mapping do not).

Root cause is that after initial task creation, defaults are loaded from Python code many times on multiple levels; the culprit seems to be `TaskInstance._refresh_from_task()`. Fixing these two lines as in this PR removes all problems described above. The trade-off is that the policy code is executed a lot more often. But assuming the policy is not implemented with performance overhead, it should not generate a performance impact.

How to test:
- Apply a cluster policy that changes the `queue` on some (or all :-D) tasks
- Use for example the `example_params_trigger_ui` DAG and introduce some random errors in the code. Example attached below.
- Run this, and ensure you have Celery workers serving the default and the "other_queue". I was setting an env `QUEUE` for the queue worker to print this in the DAG when testing.
- Check logs of failed tasks, mapped tasks that are not the first ones, and the UI display of the "queue" field.

closes: #32471

FYI @AutomationDev85 @wolfdn @clellmann

Example cluster policy used for testing as `airflow_local_settings.py`:
```
from airflow.models.taskinstance import TaskInstance


def task_instance_mutation_hook(task_instance: TaskInstance):
    print("# POLICY IS APPLIED! ##")
    task_instance.queue = "other_queue"
```

Modified DAG for testing - `example_params_trigger_ui.py`:
```
from __future__ import annotations

import datetime
from random import randint
from pathlib import Path
from os import getenv
from typing import TYPE_CHECKING

from airflow.decorators import task
from airflow.models.dag import DAG
from airflow.models.param import Param
from airflow.utils.trigger_rule import TriggerRule

if TYPE_CHECKING:
    from airflow.models.dagrun import DagRun
    from airflow.models.taskinstance import TaskInstance


def print_where_executed():
    print("")
    print(f"This task is executed on queue {getenv('QUEUE', 'UNDEFINED!')}")
    print("")


with DAG(
    dag_id=Path(__file__).stem,
    description=__doc__.partition(".")[0],
    doc_md=__doc__,
    schedule=None,
    start_date=datetime.datetime(2022, 3, 4),
    catchup=False,
    tags=["example_ui"],
    params={
        "names": Param(
            ["Linda", "Martha", "Thomas"],
            type="array",
            description="Define the list of names for which greetings should be generated in the logs."
            " Please have one name per line.",
            title="Names to greet",
        ),
        "english": Param(True, type="boolean", title="English"),
        "german": Param(True, type="boolean", title="German (Formal)"),
        "french": Param(True, type="boolean", title="French"),
    },
) as dag:

    @task(task_id="get_names", retries=4, retry_delay=5.0)
    def get_names(**kwargs) -> list[str]:
        ti: TaskInstance = kwargs["ti"]
        dag_run: DagRun = ti.dag_run
        print_where_executed()
        if randint(0, 1) > 0:
            raise Exception("Something went wrong!")
        if "names" not in dag_run.conf:
            print("Uuups, no names given, was no UI used to trigger?")
            return []
        return dag_run.conf["names"]

    @task.branch(task_id="select_languages", retries=4, retry_delay=5.0)
```
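The metadata-based queue routing the PR author describes can be sketched without Airflow at all. This is a minimal, self-contained illustration of the idea; `FakeTaskInstance`, the `ROUTING` table, and the queue names are all made up for the example, and in a real deployment the hook lives in `airflow_local_settings.py` and receives a real `TaskInstance`.

```python
# Stand-in for airflow.models.taskinstance.TaskInstance -- just enough
# attributes for the sketch.
class FakeTaskInstance:
    def __init__(self, dag_id: str, task_id: str, queue: str = "default"):
        self.dag_id = dag_id
        self.task_id = task_id
        self.queue = queue


# Hypothetical routing table: which DAGs go to which Celery queue.
ROUTING = {"heavy_etl": "big_workers", "reporting": "small_workers"}


def task_instance_mutation_hook(task_instance) -> None:
    """Route a task instance to a queue based on workflow metadata.

    Per the bug described in the PR, for routing to be reliable the hook
    must be re-applied on every refresh of the task instance (including
    retries and later-created mapped tasks), not only on initial creation.
    """
    task_instance.queue = ROUTING.get(task_instance.dag_id, "default")


ti = FakeTaskInstance(dag_id="heavy_etl", task_id="load")
task_instance_mutation_hook(ti)
print(ti.queue)  # big_workers
```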
Re: [PR] Add executor field to the DB and parameter to the operators [airflow]
potiuk commented on PR #38054:
URL: https://github.com/apache/airflow/pull/38054#issuecomment-2016878718

Yep. We cannot merge it without resolving conflict :(
(airflow) branch main updated (77d2fc7d75 -> 694826d1bd)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from 77d2fc7d75 Check task attribute before use in sentry.add_tagging() (#37143)
     add 694826d1bd refactor: Refactored __new__ magic method of BaseOperatorMeta to avoid bad mixing classic and decorated operators (#37937)

No new revisions were added by this update.

Summary of changes:
 airflow/models/base.py                         |   1 +
 airflow/models/baseoperator.py                 |  61 ++-
 airflow/models/mappedoperator.py               |   4 +
 airflow/models/taskinstance.py                 |   8 +-
 .../microsoft/azure/serialization}/__init__.py |   0
 airflow/serialization/schema.json              |   3 +-
 tests/models/test_baseoperatormeta.py          | 184 +
 tests/serialization/test_dag_serialization.py  |   1 +
 8 files changed, 257 insertions(+), 5 deletions(-)
 copy airflow/{api_connexion => providers/microsoft/azure/serialization}/__init__.py (100%)
 create mode 100644 tests/models/test_baseoperatormeta.py
Re: [PR] refactor: Refactored __new__ magic method of BaseOperatorMeta to avoid bad mixing classic and decorated operators [airflow]
potiuk merged PR #37937: URL: https://github.com/apache/airflow/pull/37937
(airflow) branch main updated: Check task attribute before use in sentry.add_tagging() (#37143)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 77d2fc7d75 Check task attribute before use in sentry.add_tagging() (#37143)

77d2fc7d75 is described below

commit 77d2fc7d7591679aa99c1924daba678463a7b7bb
Author: Lipu Fei
AuthorDate: Sun Mar 24 18:20:08 2024 +0100

    Check task attribute before use in sentry.add_tagging() (#37143)

    * Check task attribute before use in add_tagging
    * Refactor sentry and add tests

    - Co-authored-by: Lipu Fei
---
 airflow/sentry.py            | 196 ---
 airflow/sentry/__init__.py   |  29 +++
 airflow/sentry/blank.py      |  40 +
 airflow/sentry/configured.py | 176 ++
 tests/test_sentry.py         |  65 ++
 5 files changed, 310 insertions(+), 196 deletions(-)

diff --git a/airflow/sentry.py b/airflow/sentry.py
deleted file mode 100644
index d5fbf3c04d..00
--- a/airflow/sentry.py
+++ /dev/null
@@ -1,196 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-"""Sentry Integration."""
-
-from __future__ import annotations
-
-import logging
-from functools import wraps
-from typing import TYPE_CHECKING
-
-from airflow.configuration import conf
-from airflow.executors.executor_loader import ExecutorLoader
-from airflow.utils.session import find_session_idx, provide_session
-from airflow.utils.state import TaskInstanceState
-
-if TYPE_CHECKING:
-    from sqlalchemy.orm import Session
-
-    from airflow.models.taskinstance import TaskInstance
-
-log = logging.getLogger(__name__)
-
-
-class DummySentry:
-    """Blank class for Sentry."""
-
-    def add_tagging(self, task_instance):
-        """Blank function for tagging."""
-
-    def add_breadcrumbs(self, task_instance, session: Session | None = None):
-        """Blank function for breadcrumbs."""
-
-    def enrich_errors(self, run):
-        """Blank function for formatting a TaskInstance._run_raw_task."""
-        return run
-
-    def flush(self):
-        """Blank function for flushing errors."""
-
-
-Sentry: DummySentry = DummySentry()
-if conf.getboolean("sentry", "sentry_on", fallback=False):
-    import sentry_sdk
-    from sentry_sdk.integrations.flask import FlaskIntegration
-    from sentry_sdk.integrations.logging import ignore_logger
-
-    class ConfiguredSentry(DummySentry):
-        """Configure Sentry SDK."""
-
-        SCOPE_DAG_RUN_TAGS = frozenset(("data_interval_end", "data_interval_start", "execution_date"))
-        SCOPE_TASK_INSTANCE_TAGS = frozenset(("task_id", "dag_id", "try_number"))
-        SCOPE_CRUMBS = frozenset(("task_id", "state", "operator", "duration"))
-
-        UNSUPPORTED_SENTRY_OPTIONS = frozenset(
-            (
-                "integrations",
-                "in_app_include",
-                "in_app_exclude",
-                "ignore_errors",
-                "before_breadcrumb",
-            )
-        )
-
-        def __init__(self):
-            """Initialize the Sentry SDK."""
-            ignore_logger("airflow.task")
-
-            sentry_flask = FlaskIntegration()
-
-            # LoggingIntegration is set by default.
-            integrations = [sentry_flask]
-
-            executor_class, _ = ExecutorLoader.import_default_executor_cls(validate=False)
-
-            if executor_class.supports_sentry:
-                from sentry_sdk.integrations.celery import CeleryIntegration
-
-                sentry_celery = CeleryIntegration()
-                integrations.append(sentry_celery)
-
-            dsn = None
-            sentry_config_opts = conf.getsection("sentry") or {}
-            if sentry_config_opts:
-                sentry_config_opts.pop("sentry_on")
-                old_way_dsn = sentry_config_opts.pop("sentry_dsn", None)
-                new_way_dsn = sentry_config_opts.pop("dsn", None)
-                # supported backward compatibility with old way dsn option
-                dsn = old_way_dsn or new_way_dsn
-
-            unsupported_options =
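The deleted `sentry.py` above is built around the null-object pattern: a blank class whose methods do nothing, so every call site can talk to `Sentry` without checking whether the integration is enabled; the configured subclass is only constructed when config says so. A standalone sketch of that selection logic follows, with simplified method bodies and a `make_sentry` helper invented for the example (the real module does this at import time based on `conf.getboolean`).

```python
class BlankSentry:
    """No-op fallback: every hook is a harmless blank, so call sites
    never have to check whether Sentry is switched on."""

    def add_tagging(self, task_instance):
        """Blank function for tagging."""

    def enrich_errors(self, run):
        """Return the wrapped callable unchanged."""
        return run

    def flush(self):
        """Nothing buffered, nothing to flush."""


class ConfiguredSentry(BlankSentry):
    """Real implementation, only constructed when the feature is on."""

    def __init__(self):
        self.tags = {}

    def add_tagging(self, task_instance):
        # Simplified: the real class copies several task-instance
        # attributes into the Sentry scope.
        self.tags["task_id"] = task_instance["task_id"]


def make_sentry(sentry_on: bool) -> BlankSentry:
    # Mirrors the module-level selection in the deleted file:
    # Sentry = DummySentry() unless conf enables sentry_on.
    return ConfiguredSentry() if sentry_on else BlankSentry()


sentry = make_sentry(sentry_on=False)
sentry.add_tagging({"task_id": "t1"})  # safe no-op when disabled
print(type(make_sentry(True)).__name__)  # ConfiguredSentry
```

The benefit of inheriting the configured class from the blank one is that both expose the same surface, so a future method added to the blank base is automatically a safe default for the configured variant too.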
Re: [PR] Check task attribute before use in sentry.add_tagging() [airflow]
potiuk merged PR #37143: URL: https://github.com/apache/airflow/pull/37143
(airflow) branch main updated (d820f13e68 -> 4a4eee113f)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from d820f13e68 Use `relative-imports (TID252)` instead of `pygrep` rule (#38428)
     add 4a4eee113f Delete deprecated AutoML operators and deprecate AutoML hook and links (#38418)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/google/cloud/hooks/automl.py     |  758 +++-
 airflow/providers/google/cloud/links/automl.py     |   17 +-
 airflow/providers/google/cloud/operators/automl.py | 1255 +---
 airflow/providers/google/provider.yaml             |   11 -
 docker_tests/test_prod_image.py                    |    1 -
 .../operators/cloud/automl.rst                     |  229 
 generated/provider_dependencies.json               |    1 -
 pyproject.toml                                     |    1 -
 .../in_container/run_provider_yaml_files_check.py  |    7 +
 tests/always/test_project_structure.py             |    2 -
 tests/providers/google/cloud/hooks/test_automl.py  |  302 ++---
 .../google/cloud/operators/test_automl.py          |  640 +-
 .../google/cloud/automl/example_automl_dataset.py  |  201 
 .../google/cloud/automl/example_automl_model.py    |  288 -
 .../example_automl_nl_text_classification.py       |    2 -
 .../automl/example_automl_nl_text_extraction.py    |    7 +-
 .../automl/example_automl_nl_text_sentiment.py     |    7 +-
 .../cloud/automl/example_automl_translation.py     |  181 ---
 18 files changed, 267 insertions(+), 3643 deletions(-)
 delete mode 100644 docs/apache-airflow-providers-google/operators/cloud/automl.rst
 delete mode 100644 tests/system/providers/google/cloud/automl/example_automl_dataset.py
 delete mode 100644 tests/system/providers/google/cloud/automl/example_automl_model.py
 delete mode 100644 tests/system/providers/google/cloud/automl/example_automl_translation.py
Re: [PR] Delete deprecated AutoML operators (inc. 'google-cloud-automl' dep.) and deprecate hook and links [airflow]
potiuk merged PR #38418: URL: https://github.com/apache/airflow/pull/38418
Re: [PR] Improve logging readability with DockerOperator [airflow]
potiuk commented on code in PR #38008:
URL: https://github.com/apache/airflow/pull/38008#discussion_r1536864652

## airflow/providers/docker/operators/docker.py:

@@ -423,13 +447,19 @@ def _run_image_with_mounts(self, target_mounts, add_tmp_variable: bool) -> list[
         )
         logstream = self.cli.attach(container=self.container["Id"], stdout=True, stderr=True, stream=True)
         try:
-            self.cli.start(self.container["Id"])
-
-            log_lines = []
-            for log_chunk in logstream:
-                log_chunk = stringify(log_chunk).strip()
-                log_lines.append(log_chunk)
-                self.log.info("%s", log_chunk)
+            if self.container_log_formatter is not None:
+                self._change_log_formatters(self.container_log_formatter)
+
+            try:
+                self.cli.start(self.container["Id"])
+
+                log_lines = []
+                for log_chunk in logstream:
+                    log_chunk = stringify(log_chunk).strip()
+                    log_lines.append(log_chunk)
+                    self.log.info("%s", log_chunk)
+            finally:
+                self._restore_log_formatters()

Review Comment:
Right now this is not yet complete. What happens if `self.container_log_formatter` is None - we will still run `_restore`, it seems?
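The review point is that the `finally` block restores log formatters even when none were swapped in. One way to make the swap/restore pair safe is to have the restore step track what was actually changed, so that restoring with no prior swap is a no-op. This is a hedged sketch, not the PR's implementation; `FormatterSwapper` and its method names are invented for illustration.

```python
from __future__ import annotations

import logging


class FormatterSwapper:
    """Swap a formatter onto a logger's handlers and restore the originals.

    The guard is the `_saved` list: restore() only undoes swaps that
    actually happened, so calling it from a finally block is harmless
    when the formatter was None -- the case flagged in the review.
    """

    def __init__(self, logger: logging.Logger, formatter: logging.Formatter | None):
        self.logger = logger
        self.formatter = formatter
        self._saved: list[tuple[logging.Handler, logging.Formatter | None]] = []

    def swap(self) -> None:
        if self.formatter is None:
            return  # nothing changed, nothing to restore later
        for handler in self.logger.handlers:
            self._saved.append((handler, handler.formatter))
            handler.setFormatter(self.formatter)

    def restore(self) -> None:
        for handler, original in self._saved:
            handler.setFormatter(original)
        self._saved.clear()


log = logging.getLogger("docker.demo")
log.addHandler(logging.StreamHandler())
original = log.handlers[0].formatter

swapper = FormatterSwapper(log, logging.Formatter("%(message)s"))
try:
    swapper.swap()
    # ... stream container log chunks here ...
finally:
    swapper.restore()

print(log.handlers[0].formatter is original)  # True
```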
Re: [PR] Adding Task ids in DAG details API endpoint #37564 [airflow]
msoni1369 commented on PR #37866:
URL: https://github.com/apache/airflow/pull/37866#issuecomment-2016869645

> If we call the field `tasks`, users will expect more information, not just `task_ids`. You may change the field to task_ids. In the future, we can deprecate that and have `tasks` fields that would have more information about the said tasks

@ephraimbuddy, that's a good suggestion. I will go ahead and rename the field as you suggested. Regarding @hussein-awala's suggestion, should we include only the task_ids as of now, or have an identical response as the /dags/{dag_id}/tasks endpoint?
Re: [PR] Support adding custom TI Deps to help DagRun make more flexible TI scheduling decisions [airflow]
potiuk commented on PR #37778:
URL: https://github.com/apache/airflow/pull/37778#issuecomment-2016865953

Agree with TP that if we were to merge this one, we will need quite a bit more:
* documentation with some examples
* general use cases where it would be available
* some performance considerations (i.e. what those who create the custom deps should be aware of when writing their own deps)
* dos and don'ts
* possibly some way of detecting when someone is doing bad things there (not sure how it should look like)

The big difference vs. what we have now with "built-in" deps is that the scheduler gives control to "user" code in a very, very "hot area" of the code. Unlike most of the other code we allow our users to configure in Airflow, this one will have a crucial and very serious impact on the scheduling performance.
(airflow) branch main updated (81790aaaa0 -> d820f13e68)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from 817900     Fix deprecated `DockerOperator` operator arguments in `MappedOperator` (#38379)
     add d820f13e68 Use `relative-imports (TID252)` instead of `pygrep` rule (#38428)

No new revisions were added by this update.

Summary of changes:
 .pre-commit-config.yaml                            |   8 -
 contributing-docs/08_static_code_checks.rst        |   2 -
 dev/breeze/doc/images/output_static-checks.svg     | 130 ++---
 dev/breeze/doc/images/output_static-checks.txt     |   2 +-
 dev/breeze/src/airflow_breeze/pre_commit_ids.py    |   1 -
 docs/exts/docs_build/docs_builder.py               |   8 +-
 pyproject.toml                                     |   5 +-
 tests/providers/amazon/aws/hooks/test_eks.py       |   5 +-
 tests/providers/amazon/aws/utils/eks_test_utils.py |   2 +-
 .../cncf/kubernetes/operators/test_pod.py          |  12 +-
 .../cncf/kubernetes/utils/test_pod_manager.py      |   3 +-
 .../elasticsearch/log/elasticmock/__init__.py      |   2 +-
 .../log/elasticmock/fake_elasticsearch.py          |   6 +-
 .../elasticsearch/log/test_es_task_handler.py      |   5 +-
 14 files changed, 91 insertions(+), 100 deletions(-)
Re: [PR] Use `relative-imports (TID252)` instead of `pygrep` rule [airflow]
potiuk merged PR #38428: URL: https://github.com/apache/airflow/pull/38428
(airflow) branch main updated: Fix deprecated `DockerOperator` operator arguments in `MappedOperator` (#38379)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 817900     Fix deprecated `DockerOperator` operator arguments in `MappedOperator` (#38379)

817900 is described below

commit 81790029fe4e5102b25fce002e2e0f3c69db
Author: Andrey Anshin
AuthorDate: Sun Mar 24 20:36:27 2024 +0400

    Fix deprecated `DockerOperator` operator arguments in `MappedOperator` (#38379)
---
 airflow/providers/docker/operators/docker.py    |  7 +++-
 tests/providers/docker/operators/test_docker.py | 56 +
 2 files changed, 61 insertions(+), 2 deletions(-)

diff --git a/airflow/providers/docker/operators/docker.py b/airflow/providers/docker/operators/docker.py
index 2802b9aea7..1d5320d120 100644
--- a/airflow/providers/docker/operators/docker.py
+++ b/airflow/providers/docker/operators/docker.py
@@ -44,6 +44,7 @@ from airflow.providers.docker.exceptions import (
     DockerContainerFailedSkipException,
 )
 from airflow.providers.docker.hooks.docker import DockerHook
+from airflow.utils.types import NOTSET, ArgNotSet

 if TYPE_CHECKING:
     from docker import APIClient
@@ -240,9 +241,11 @@ class DockerOperator(BaseOperator):
         skip_on_exit_code: int | Container[int] | None = None,
         port_bindings: dict | None = None,
         ulimits: list[Ulimit] | None = None,
+        # deprecated, no need to include into docstring
+        skip_exit_code: int | Container[int] | ArgNotSet = NOTSET,
         **kwargs,
     ) -> None:
-        if skip_exit_code := kwargs.pop("skip_exit_code", None):
+        if skip_exit_code is not NOTSET:
             warnings.warn(
                 "`skip_exit_code` is deprecated and will be removed in the future. "
                 "Please use `skip_on_exit_code` instead.",
@@ -255,7 +258,7 @@ class DockerOperator(BaseOperator):
                     f"skip_on_exit_code={skip_on_exit_code!r}, skip_exit_code={skip_exit_code!r}."
                 )
                 raise ValueError(msg)
-            skip_on_exit_code = skip_exit_code
+            skip_on_exit_code = skip_exit_code  # type: ignore[assignment]
         if isinstance(auto_remove, bool):
             warnings.warn(
                 "bool value for `auto_remove` is deprecated and will be removed in the future. "
diff --git a/tests/providers/docker/operators/test_docker.py b/tests/providers/docker/operators/test_docker.py
index 943ff24eb9..6b4d196ad5 100644
--- a/tests/providers/docker/operators/test_docker.py
+++ b/tests/providers/docker/operators/test_docker.py
@@ -29,6 +29,7 @@ from docker.types import DeviceRequest, LogConfig, Mount, Ulimit
 from airflow.exceptions import AirflowException, AirflowProviderDeprecationWarning, AirflowSkipException
 from airflow.providers.docker.exceptions import DockerContainerFailedException
 from airflow.providers.docker.operators.docker import DockerOperator
+from airflow.utils.task_instance_session import set_current_task_instance_session

 TEST_CONN_ID = "docker_test_connection"
 TEST_DOCKER_URL = "unix://var/run/docker.test.sock"
@@ -807,3 +808,58 @@ class TestDockerOperator:
         monkeypatch.delenv("DOCKER_HOST", raising=False)
         operator = DockerOperator(task_id="test", image="test")
         assert operator.docker_url == "unix://var/run/docker.sock"
+
+    @pytest.mark.db_test
+    @pytest.mark.parametrize(
+        "skip_exit_code, skip_on_exit_code, expected",
+        [
+            pytest.param(101, None, [101], id="skip-on-exit-code-not-set"),
+            pytest.param(102, 102, [102], id="skip-on-exit-code-same"),
+        ],
+    )
+    def test_partial_deprecated_skip_exit_code(
+        self, skip_exit_code, skip_on_exit_code, expected, dag_maker, session
+    ):
+        with dag_maker(dag_id="test_partial_deprecated_skip_exit_code", session=session):
+            DockerOperator.partial(
+                task_id="fake-task-id",
+                skip_exit_code=skip_exit_code,
+                skip_on_exit_code=skip_on_exit_code,
+            ).expand(image=["test", "apache/airflow"])
+
+        dr = dag_maker.create_dagrun()
+        tis = dr.get_task_instances(session=session)
+        with set_current_task_instance_session(session=session):
+            warning_match = r"`skip_exit_code` is deprecated and will be removed"
+            for ti in tis:
+                with pytest.warns(AirflowProviderDeprecationWarning, match=warning_match):
+                    ti.render_templates()
+                assert ti.task.skip_on_exit_code == expected
+
+    @pytest.mark.db_test
+    @pytest.mark.parametrize(
+        "skip_exit_code, skip_on_exit_code",
+        [
+            pytest.param(103, 0, id="skip-on-exit-code-zero"),
+            pytest.param(104, 105, id="skip-on-exit-code-not-same"),
+        ],
+    )
+    def
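The core of the fix above is moving the deprecated `skip_exit_code` out of `kwargs.pop()` and into the signature behind a `NOTSET` sentinel, which distinguishes "argument omitted" from "argument given as None or 0". A standalone sketch of that sentinel idiom follows; `make_operator` and the simplified `ArgNotSet` are invented for illustration and are not Airflow's actual classes.

```python
from __future__ import annotations

import warnings


class ArgNotSet:
    """Sentinel type: marks an argument the caller never supplied."""


NOTSET = ArgNotSet()


def make_operator(
    skip_on_exit_code: int | None = None,
    skip_exit_code: int | ArgNotSet = NOTSET,
) -> int | None:
    """Accept a deprecated alias explicitly instead of via kwargs.pop().

    Declaring the deprecated parameter in the signature (rather than
    hiding it in **kwargs) is what lets partial/mapped machinery see and
    validate it -- the bug the commit above fixes. The sentinel also
    keeps falsy-but-valid values like 0 working, which the old
    `if kwargs.pop(...)` truthiness check silently dropped.
    """
    if skip_exit_code is not NOTSET:
        warnings.warn(
            "`skip_exit_code` is deprecated. Use `skip_on_exit_code` instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        if skip_on_exit_code is not None and skip_on_exit_code != skip_exit_code:
            raise ValueError("Conflicting skip_exit_code / skip_on_exit_code values.")
        skip_on_exit_code = skip_exit_code  # type: ignore[assignment]
    return skip_on_exit_code


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = make_operator(skip_exit_code=101)
print(result, len(caught))  # 101 1
```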
Re: [PR] Fix deprecated `DockerOperator` operator arguments in `MappedOperator` [airflow]
potiuk merged PR #38379: URL: https://github.com/apache/airflow/pull/38379
(airflow) branch main updated (0cc410b5b8 -> d3e9229105)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

    from 0cc410b5b8 Updat build dependencies after hatchling update (#38439)
     add d3e9229105 Bugfix/update screenshots (#38438)

No new revisions were added by this update.

Summary of changes:
 airflow/example_dags/tutorial.py                   |   2 +-
 docs/apache-airflow/core-concepts/dag-run.rst      |   8 
 docs/apache-airflow/img/add-dag-tags.png           | Bin 68537 -> 54332 bytes
 docs/apache-airflow/img/basic-dag.png              | Bin 5393 -> 12837 bytes
 docs/apache-airflow/img/branch_note.png            | Bin 31771 -> 31259 bytes
 docs/apache-airflow/img/branch_with_trigger.png    | Bin 34081 -> 28541 bytes
 docs/apache-airflow/img/branch_without_trigger.png | Bin 39815 -> 28914 bytes
 docs/apache-airflow/img/calendar.png               | Bin 58726 -> 34689 bytes
 .../default_instance_name_configuration.png        | Bin 154888 -> 68441 bytes
 .../example_instance_name_configuration.png        | Bin 154915 -> 68231 bytes
 .../img/change-ui-colors/dags-page-new.png         | Bin 483599 -> 74143 bytes
 .../img/change-ui-colors/dags-page-old.png         | Bin 493009 -> 72288 bytes
 .../img/change-ui-colors/graph-view-new.png        | Bin 56973 -> 34492 bytes
 .../img/change-ui-colors/graph-view-old.png        | Bin 54884 -> 34721 bytes
 .../img/change-ui-colors/tree-view-new.png         | Bin 36934 -> 20188 bytes
 .../img/change-ui-colors/tree-view-old.png         | Bin 21601 -> 20146 bytes
 docs/apache-airflow/img/context.png                | Bin 124467 -> 0 bytes
 docs/apache-airflow/img/dag_doc.png                | Bin 40094 -> 93757 bytes
 docs/apache-airflow/img/edge_label_example.png     | Bin 24592 -> 33270 bytes
 docs/apache-airflow/img/example_passing_conf.png   | Bin 41080 -> 53383 bytes
 .../img/latest_only_with_trigger.png               | Bin 42887 -> 59580 bytes
 docs/apache-airflow/img/mapping-simple-graph.png   | Bin 13312 -> 7731 bytes
 docs/apache-airflow/img/mapping-simple-grid.png    | Bin 179670 -> 93751 bytes
 docs/apache-airflow/img/nested_branching.png       | Bin 31847 -> 0 bytes
 docs/apache-airflow/img/scheduler_loop.jpg         | Bin 46864 -> 0 bytes
 .../img/smart_sensor_architecture.png              | Bin 80325 -> 0 bytes
 .../img/smart_sensor_single_task_execute_flow.png  | Bin 75462 -> 0 bytes
 docs/apache-airflow/img/task_doc.png               | Bin 245714 -> 145331 bytes
 docs/apache-airflow/img/tutorial-pipeline-1.png    | Bin 566225 -> 66411 bytes
 docs/apache-airflow/img/tutorial-pipeline-2.png    | Bin 345529 -> 137595 bytes
 docs/apache-airflow/img/ui-alert-message.png       | Bin 7909 -> 48753 bytes
 docs/apache-airflow/img/watcher.png                | Bin 41592 -> 20310 bytes
 docs/apache-airflow/tutorial/pipeline.rst          |  10 +-
 33 files changed, 14 insertions(+), 6 deletions(-)
 mode change 100755 => 100644 docs/apache-airflow/img/basic-dag.png
 delete mode 100644 docs/apache-airflow/img/context.png
 mode change 100755 => 100644 docs/apache-airflow/img/edge_label_example.png
 delete mode 100644 docs/apache-airflow/img/nested_branching.png
 delete mode 100644 docs/apache-airflow/img/scheduler_loop.jpg
 delete mode 100644 docs/apache-airflow/img/smart_sensor_architecture.png
 delete mode 100644 docs/apache-airflow/img/smart_sensor_single_task_execute_flow.png
Re: [PR] Bugfix/update screenshots [airflow]
potiuk merged PR #38438: URL: https://github.com/apache/airflow/pull/38438 -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on code in PR #38437:
URL: https://github.com/apache/airflow/pull/38437#discussion_r1536854627

## airflow_pre_installed_providers.txt ##
@@ -1,7 +1,7 @@
 # List of all the providers that are pre-installed when you run `pip install apache-airflow` without extras
 common.io
 common.sql
-fab>=1.0.2dev0
+fab>=1.0.2dev1

Review Comment:
   I needed it for UV installation.
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437:
URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016858378

cc: @uranusjr -> I'd appreciate your comments here, I believe now we should be **really** compliant with the PEP expectations. I compared the generated METADATA in the .whl files and they are very similar (some duplicate entries removed, and I also got rid of the `devel-ci` extra from the .whl - it's not needed there at all after using an `--editable` install from the downloaded package in the CI Dockerfile).
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (db109c8372 -> 8343dabf24)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata
in repository https://gitbox.apache.org/repos/asf/airflow.git

    discard db109c8372 Turn optional-dependencies in pyproject.toml into dynamic property
        add 8343dabf24 Turn optional-dependencies in pyproject.toml into dynamic property

This update added new revisions after undoing existing revisions. That is
to say, some revisions that were in the old version of the branch are not
in the new version. This situation occurs when a user --force pushes a
change and generates a repository containing something like this:

 * -- * -- B -- O -- O -- O (db109c8372)
            \
             N -- N -- N refs/heads/turn-optional-dependencies-in-dynamic-metadata (8343dabf24)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions from
the common base, B.

Any revisions marked "omit" are not gone; other references still refer to
them. Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 dev/breeze/README.md      | 2 +-
 dev/breeze/pyproject.toml | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
Re: [PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk commented on PR #38437:
URL: https://github.com/apache/airflow/pull/38437#issuecomment-2016854440

After some recent (and looking at old) discussions, it turns out that we need one more thing to be fully PEP-517 compliant. While hatchling allows us (for now) to dynamically modify project attributes that are "static" in `pyproject.toml`, this is not what is supposed to happen, because various tools might take such dependencies directly from `pyproject.toml` if they are not marked as dynamic (and will not run the build hook to retrieve the dynamic values for those properties). So we either generate the requirements/optional dependencies dynamically, or we describe them statically in `pyproject.toml`, but we should not combine the two cases.

The comment describing it is here: https://github.com/pypa/pip/issues/11440#issuecomment-1774064882

I realized that after this discussion in hatch https://github.com/pypa/hatch/issues/1331 but also this comment in uv https://github.com/astral-sh/uv/issues/2475#issuecomment-2002003895.

It seems that what we do is a hack. While it is convenient to keep dynamic requirements and optional dependencies in `pyproject.toml`, it is non-standard and can behave differently for different ways of installing airflow (from a wheel file, sdist, or URL - depending on what different tools are doing). For one, `pip` currently installs airflow "properly" from a URL, while UV uses only `pyproject.toml`. This had no impact on released airflow wheel packages, because the wheel keeps dependencies in METADATA, and that is where tools get the dependencies from.
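[Editorial illustration] When `optional-dependencies` is declared dynamic, the build backend computes the extras mapping at build time instead of reading it from `pyproject.toml` - a tool that reads `pyproject.toml` directly never sees those extras. A minimal, hypothetical sketch of the kind of computation such a hook performs; the function names and provider data below are illustrative, not Airflow's actual `hatch_build.py`:

```python
# Hypothetical sketch of the computation a custom metadata build hook
# performs when "optional-dependencies" is marked dynamic. A real hook
# would subclass hatchling's MetadataHookInterface and do this work
# inside its update(metadata) method.

def build_optional_dependencies(provider_deps: dict) -> dict:
    """Map provider ids like 'common.sql' to pip extras and their requirements."""
    extras = {}
    for provider, extra_deps in provider_deps.items():
        extra = provider.replace(".", "-")
        # Each provider extra pulls in the provider distribution itself...
        dist = f"apache-airflow-providers-{extra}"
        # ...plus any additional requirements the provider needs.
        extras[extra] = [dist, *extra_deps]
    return extras


def update_metadata(metadata: dict, provider_deps: dict) -> None:
    # Mirrors the hatchling hook contract: mutate the metadata dict the
    # backend passes in, rather than returning a value.
    metadata["optional-dependencies"] = build_optional_dependencies(provider_deps)
```

Because the extras only exist once the hook has run, a tool that reads `pyproject.toml` statically sees none of them - which is exactly the inconsistency between installers described above.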
Re: [PR] changing dag_processing.processes from UpDownCounter to guage [airflow]
Bowrna closed pull request #38400: changing dag_processing.processes from UpDownCounter to guage
URL: https://github.com/apache/airflow/pull/38400
Re: [PR] heartbeat recovery message [airflow]
Bowrna commented on code in PR #34457:
URL: https://github.com/apache/airflow/pull/34457#discussion_r1536847700

## airflow/jobs/job.py ##
@@ -131,22 +146,20 @@ def executors(self):
     def heartrate(self) -> float:
         return Job._heartrate(self.job_type)

-    def is_alive(self, grace_multiplier=2.1) -> bool:
+    def is_alive(self) -> bool:
         """
         Is this job currently alive.

         We define alive as in a state of RUNNING, and having sent a heartbeat
         within a multiple of the heartrate (default of 2.1)
-
-        :param grace_multiplier: multiplier of heartrate to require heart beat
-            within
         """
+        threshold_value = health_check_threshold(
+            self.job_type, self.heartrate, self.grace_multiplier

Review Comment:
   hmm i tried other way too.. let me check it again @potiuk
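[Editorial illustration] The review thread above is about moving the grace multiplier out of `is_alive()`'s signature and into a shared threshold helper. A standalone sketch of the liveness rule under discussion, assuming a simplified job model (the real `Job.is_alive` lives in `airflow/jobs/job.py` and differs in details):

```python
from datetime import datetime, timedelta, timezone


def is_alive(state: str, latest_heartbeat: datetime, heartrate: float,
             grace_multiplier: float = 2.1) -> bool:
    """A job is alive if it is RUNNING and its last heartbeat arrived
    within heartrate * grace_multiplier seconds (the health check threshold)."""
    threshold = timedelta(seconds=heartrate * grace_multiplier)
    return (
        state == "running"
        and datetime.now(timezone.utc) - latest_heartbeat < threshold
    )
```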
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (ad76bf579b -> db109c8372)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata
in repository https://gitbox.apache.org/repos/asf/airflow.git

    discard ad76bf579b Turn optional-dependencies in pyproject.toml into dynamic property
        add 0cc410b5b8 Updat build dependencies after hatchling update (#38439)
        add db109c8372 Turn optional-dependencies in pyproject.toml into dynamic property

No new revisions were added by this update.

Summary of changes:
 pyproject.toml | 1 -
 1 file changed, 1 deletion(-)
(airflow) branch main updated: Updat build dependencies after hatchling update (#38439)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

The following commit(s) were added to refs/heads/main by this push:
     new 0cc410b5b8 Updat build dependencies after hatchling update (#38439)
0cc410b5b8 is described below

commit 0cc410b5b86e851d6b18f391ba0b4a3a5d696ee8
Author: Jarek Potiuk
AuthorDate: Sun Mar 24 16:51:41 2024 +0100

    Updat build dependencies after hatchling update (#38439)
---
 pyproject.toml | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/pyproject.toml b/pyproject.toml
index 643636ae2a..751173f7c4 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -24,12 +24,11 @@ requires = [
     "GitPython==3.1.42",
     "gitdb==4.0.11",
-    "hatchling==1.22.3",
+    "hatchling==1.22.4",
     "packaging==24.0",
     "pathspec==0.12.1",
     "pluggy==1.4.0",
     "smmap==5.0.1",
-    "tomli==2.0.1; python_version < '3.11'",
     "trove-classifiers==2024.3.3",
 ]
 build-backend = "hatchling.build"
Re: [PR] Update build dependencies after hatchling update [airflow]
potiuk merged PR #38439: URL: https://github.com/apache/airflow/pull/38439
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (e2a3b9cfe9 -> ad76bf579b)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata
in repository https://gitbox.apache.org/repos/asf/airflow.git

    discard e2a3b9cfe9 Turn optional-dependencies in pyproject.toml into dynamic property
        add ad76bf579b Turn optional-dependencies in pyproject.toml into dynamic property

No new revisions were added by this update.

Summary of changes:
 Dockerfile                                                     | 2 +-
 Dockerfile.ci                                                  | 2 +-
 scripts/docker/install_airflow_dependencies_from_branch_tip.sh | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (4601d60e83 -> e2a3b9cfe9)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata
in repository https://gitbox.apache.org/repos/asf/airflow.git

       omit 4601d60e83 Turn optional-dependencies in pyproject.toml into dynamic property
        add e2a3b9cfe9 Turn optional-dependencies in pyproject.toml into dynamic property

No new revisions were added by this update.

Summary of changes:
 Dockerfile                                                    | 11 ---
 Dockerfile.ci                                                 | 11 ---
 dev/breeze/src/airflow_breeze/utils/selective_checks.py       |  2 +-
 hatch_build.py                                                |  5 -
 .../docker/install_airflow_dependencies_from_branch_tip.sh    | 11 ---
 5 files changed, 25 insertions(+), 15 deletions(-)
[PR] Updat build dependencies after hatchling update [airflow]
potiuk opened a new pull request, #38439:
URL: https://github.com/apache/airflow/pull/38439

---
**^ Add meaningful description above**

Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)** for more information. In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed. In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x). In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
Re: [PR] Bugfix/update screenshots [airflow]
potiuk commented on PR #38438:
URL: https://github.com/apache/airflow/pull/38438#issuecomment-2016834179

NICE
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (1e079111ae -> 4601d60e83)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata
in repository https://gitbox.apache.org/repos/asf/airflow.git

    discard 1e079111ae Turn optional-dependencies in pyproject.toml into dynamic property
        add 4601d60e83 Turn optional-dependencies in pyproject.toml into dynamic property

No new revisions were added by this update.

Summary of changes:
 INSTALL                                            | 40 +++-
 .../12_airflow_dependencies_and_extras.rst         | 39 ++-
 dev/breeze/tests/test_selective_checks.py          | 49 ++
 pyproject.toml                                     | 76 --
 scripts/ci/pre_commit/common_precommit_utils.py    | 16 -
 scripts/ci/pre_commit/pre_commit_insert_extras.py  | 18 ++---
 6 files changed, 163 insertions(+), 75 deletions(-)
[PR] Bugfix/update screenshots [airflow]
jscheffl opened a new pull request, #38438:
URL: https://github.com/apache/airflow/pull/38438

Following my PR #38435, I realized that a lot of other images and screen captures are totally outdated compared to the current UI. This PR updates most of them. I was careful not to update all of them, hoping that the PR from @bbovenzi --> #37988 is merged; then a few other screen captures will need an update anyway if the headers change.

Side effect: found some images which were totally outdated and un-referenced in code. Also adjusted the tutorial DAG to use an underscore and not a dash for naming consistency.
(airflow) branch turn-optional-dependencies-in-dynamic-metadata updated (0ba4f701c1 -> 1e079111ae)
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch turn-optional-dependencies-in-dynamic-metadata
in repository https://gitbox.apache.org/repos/asf/airflow.git

    discard 0ba4f701c1 Turn optional-dependencies in pyproject.toml into dynamic property
        add 1e079111ae Turn optional-dependencies in pyproject.toml into dynamic property

No new revisions were added by this update.

Summary of changes:
 INSTALL                                            | 118 ++---
 .../12_airflow_dependencies_and_extras.rst         |  80 --
 hatch_build.py                                     |  30 +++---
 scripts/ci/pre_commit/common_precommit_utils.py    |  15 ++-
 scripts/ci/pre_commit/pre_commit_insert_extras.py  |  64 ++-
 5 files changed, 183 insertions(+), 124 deletions(-)
[PR] Turn optional-dependencies in pyproject.toml into dynamic property [airflow]
potiuk opened a new pull request, #38437:
URL: https://github.com/apache/airflow/pull/38437

While hatchling and pip currently support dynamic replacement of dependencies even if they are statically defined, this is not proper according to PEP 621. When a property of the project is declared dynamic, it must not also contain static values - it's either static or dynamic. This is not a problem for wheel packages when installed by any standard tool, because the wheel package has all the metadata added to the wheel (and does not contain pyproject.toml), but in various cases (such as installing airflow via a GitHub URL or from sdist) it can make a difference - depending on whether the tool installing airflow uses pyproject.toml directly for optimization, or runs build hooks to prepare the dependencies.

This change makes all optional dependencies dynamic - rather than baking them into pyproject.toml, we mark them as dynamic, so that any tool that uses pyproject.toml or the sdist PKG-INFO will know that it has to run build hooks to get the actual optional dependencies. There are a few consequences of that:

* our pyproject.toml will not contain an automatically generated part - which is actually good, as it caused some confusion
* all our dynamic optional dependencies are either present in hatch_build.py or calculated there - this is a bit new but sounds reasonable; those dynamic dependencies are not updated often, so this is not an issue to maintain them there
* the pre-commits that manage the optional dependencies got a lot simpler - a lot of code has been removed
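[Editorial illustration] Concretely, the PEP 621 rule described above means that once a field is listed in `dynamic`, its static table must be absent from `pyproject.toml` and the build backend supplies the value. A minimal sketch using hatchling's custom metadata hook configuration (field names follow hatchling's documentation; the actual Airflow configuration may differ):

```toml
[project]
name = "apache-airflow"
# These fields may NOT also appear statically below; the build
# hook computes them when the package is built.
dynamic = ["version", "dependencies", "optional-dependencies"]

[tool.hatch.metadata.hooks.custom]
# Path of the hook module; hatchling defaults to hatch_build.py.
path = "hatch_build.py"
```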