Re: [PR] Enable non-default FTP port [airflow]

2024-04-27 Thread via GitHub


kerlion commented on PR #39048:
URL: https://github.com/apache/airflow/pull/39048#issuecomment-2081334434

   > Can you please add unit test to cover this change?
   
   Sorry, I do not know how to add a unit test, but I tested it in my PROD env.
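   
   For reference, a minimal sketch of what such a unit test could look like, assuming the change routes the connection's port into `ftplib.FTP.connect` inside `FTPHook.get_conn` (the patched paths and the final assertion are assumptions, not taken from this PR):
   
    from unittest import mock

    from airflow.models import Connection
    from airflow.providers.ftp.hooks.ftp import FTPHook


    @mock.patch("airflow.providers.ftp.hooks.ftp.ftplib.FTP")
    @mock.patch.object(FTPHook, "get_connection")
    def test_non_default_port_is_used(mock_get_connection, mock_ftp):
        # Connection with a non-default FTP port; credentials are dummy values.
        mock_get_connection.return_value = Connection(
            conn_id="ftp_default",
            conn_type="ftp",
            host="ftp.example.com",
            login="user",
            password="pass",
            port=2121,
        )

        FTPHook(ftp_conn_id="ftp_default").get_conn()

        # Assumes get_conn() instantiates ftplib.FTP() and then calls connect(host, port).
        mock_ftp.return_value.connect.assert_called_once_with("ftp.example.com", 2121)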


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] add deferrable support to `DatabricksNotebookOperator` [airflow]

2024-04-27 Thread via GitHub


rawwar opened a new pull request, #39295:
URL: https://github.com/apache/airflow/pull/39295

   related: #39178 
   
   This PR intends to make `DatabricksNotebookOperator` deferrable
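   
   For readers unfamiliar with the mechanism, a minimal sketch of Airflow's general deferral pattern (illustrative only: the trigger below is a stand-in, not the actual Databricks trigger or the implementation in this PR):
   
    from __future__ import annotations

    import asyncio
    from typing import Any, AsyncIterator

    from airflow.models.baseoperator import BaseOperator
    from airflow.triggers.base import BaseTrigger, TriggerEvent


    class SleepTrigger(BaseTrigger):
        """Stand-in trigger that fires after a delay (placeholder for a job-polling trigger)."""

        def __init__(self, seconds: int):
            super().__init__()
            self.seconds = seconds

        def serialize(self) -> tuple[str, dict[str, Any]]:
            # Module path is hypothetical; it must point at this class in your project.
            return ("my_dags.SleepTrigger", {"seconds": self.seconds})

        async def run(self) -> AsyncIterator[TriggerEvent]:
            await asyncio.sleep(self.seconds)
            yield TriggerEvent({"status": "done"})


    class MyDeferrableOperator(BaseOperator):
        def execute(self, context):
            # Free the worker slot and hand control over to the triggerer.
            self.defer(trigger=SleepTrigger(seconds=60), method_name="execute_complete")

        def execute_complete(self, context, event=None):
            # Runs on a worker again once the trigger fires.
            self.log.info("Trigger finished with event: %s", event)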





Re: [I] Update Databricks provider to depend on databricks-sql-connector >= 3.1.0 [airflow]

2024-04-27 Thread via GitHub


sunchao commented on issue #39274:
URL: https://github.com/apache/airflow/issues/39274#issuecomment-2081272243

   Thanks @Taragolis ! Yes, looks like this depends on 
https://github.com/apache/airflow/issues/28723





Re: [PR] Don't clear downstream tasks when marked status is failure and downstream is True [airflow]

2024-04-27 Thread via GitHub


github-actions[bot] closed pull request #23079: Don't clear downstream tasks 
when marked status is failure and downstream is True
URL: https://github.com/apache/airflow/pull/23079





[PR] fix: sqa deprecations for airflow providers [airflow]

2024-04-27 Thread via GitHub


dondaum opened a new pull request, #39293:
URL: https://github.com/apache/airflow/pull/39293

   
   
   
   
   related: #28723
   
   Fix deprecations for SQLAlchemy 2.0 in Airflow providers.
   
   **FAB**
   SQLAlchemy 2.0 changes the behavior when an object is merged into a Session along the backref cascade: the User must already be present in the session before it is merged via the backref cascade of the relationship Role.user.
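   
   A minimal, generic illustration of that behavior change (plain declarative models, not the actual FAB Role/User mapping):
   
    from sqlalchemy import Column, ForeignKey, Integer, create_engine
    from sqlalchemy.orm import Session, declarative_base, relationship

    Base = declarative_base()


    class Role(Base):
        __tablename__ = "role"
        id = Column(Integer, primary_key=True)
        users = relationship("User", backref="role")


    class User(Base):
        __tablename__ = "user"
        id = Column(Integer, primary_key=True)
        role_id = Column(Integer, ForeignKey("role.id"))


    engine = create_engine("sqlite://")
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        role = Role()
        session.add(role)
        user = User()
        session.add(user)  # required in 2.0: assigning the backref below no longer cascades `user` into the session
        user.role = role
        session.commit()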
   
   **Openlineage**
   Removed the sqlalchemy_engine completely. First, in SQLAlchemy 2.0 the ability to associate an Engine with a MetaData object has been [removed](https://docs.sqlalchemy.org/en/20/changelog/migration_20.html#implicit-and-connectionless-execution-bound-metadata-removed); second, IMO the sqlalchemy_engine is not used in the function at all: it is passed to MetaData but never used afterwards.
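   
   A short sketch of the unbound-MetaData pattern SQLAlchemy 2.0 expects (a generic example, not the provider code):
   
    from sqlalchemy import MetaData, create_engine

    engine = create_engine("sqlite://")  # stand-in engine

    # 2.0: MetaData can no longer be bound to an Engine, so pass the bind
    # explicitly to the operations that actually need it.
    metadata = MetaData()
    metadata.reflect(bind=engine)
    print(sorted(metadata.tables))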
   
   
   ### Reported in providers
   
   - [x] 
[airflow/providers/openlineage/utils/sql.py:152](https://github.com/apache/airflow/blob/15eedd080428314cf6e27443f91624459d17a77d/airflow/providers/openlineage/utils/sql.py#L152)
   - [x] 
[airflow/providers/fab/auth_manager/security_manager/override.py:1509](https://github.com/apache/airflow/blob/15eedd080428314cf6e27443f91624459d17a77d/airflow/providers/fab/auth_manager/security_manager/override.py#L1509)
   
   
   





Re: [I] Create a Cloud Storage Operator that could return a list of objects in a folder [airflow]

2024-04-27 Thread via GitHub


RNHTTR commented on issue #39290:
URL: https://github.com/apache/airflow/issues/39290#issuecomment-2081158158

   You can just edit this Issue to request that this Operator be added to the 
list of operators in the docs, I think.





Re: [PR] Warn users when viewing docs for an older version of Airflow [airflow-site]

2024-04-27 Thread via GitHub


RNHTTR closed pull request #872: Warn users when viewing docs for an older 
version of Airflow
URL: https://github.com/apache/airflow-site/pull/872





Re: [PR] Warn users when viewing docs for an older version of Airflow [airflow-site]

2024-04-27 Thread via GitHub


RNHTTR commented on PR #872:
URL: https://github.com/apache/airflow-site/pull/872#issuecomment-2081157504

   Nope :\





(airflow) branch v2-9-test updated: Copy menu_item href for nav bar (#39282)

2024-04-27 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-9-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-9-test by this push:
 new cddaf23e59 Copy menu_item href for nav bar (#39282)
cddaf23e59 is described below

commit cddaf23e594aa6999421a39ba8cf718f884de9fe
Author: Brent Bovenzi 
AuthorDate: Sat Apr 27 12:21:24 2024 -0400

Copy menu_item href for nav bar (#39282)

Co-authored-by: Jed Cunningham 
(cherry picked from commit 25f901a963001377621abe0ac0a1ff121a042bcd)
---
 airflow/auth/managers/base_auth_manager.py| 10 --
 tests/auth/managers/test_base_auth_manager.py | 16 +++-
 2 files changed, 19 insertions(+), 7 deletions(-)

diff --git a/airflow/auth/managers/base_auth_manager.py 
b/airflow/auth/managers/base_auth_manager.py
index 44fc53a66e..86f0ebd6dc 100644
--- a/airflow/auth/managers/base_auth_manager.py
+++ b/airflow/auth/managers/base_auth_manager.py
@@ -398,12 +398,10 @@ class BaseAuthManager(LoggingMixin):
 accessible_items = []
 for menu_item in items:
 menu_item_copy = MenuItem(
-name=menu_item.name,
-icon=menu_item.icon,
-label=menu_item.label,
-childs=[],
-baseview=menu_item.baseview,
-cond=menu_item.cond,
+**{
+**menu_item.__dict__,
+"childs": [],
+}
 )
 if menu_item.childs:
 accessible_children = []
diff --git a/tests/auth/managers/test_base_auth_manager.py 
b/tests/auth/managers/test_base_auth_manager.py
index a39b60787c..7628924ad6 100644
--- a/tests/auth/managers/test_base_auth_manager.py
+++ b/tests/auth/managers/test_base_auth_manager.py
@@ -300,7 +300,15 @@ class TestBaseAuthManager:
 mock_security_manager.has_access.side_effect = [True, False, True, 
True, False]
 
 menu = Menu()
-menu.add_link("item1")
+menu.add_link(
+# These may not all be valid types, but it does let us check each 
attr is copied
+name="item1",
+href="h1",
+icon="i1",
+label="l1",
+baseview="b1",
+cond="c1",
+)
 menu.add_link("item2")
 menu.add_link("item3")
 menu.add_link("item3.1", category="item3")
@@ -313,6 +321,12 @@ class TestBaseAuthManager:
 assert result[1].name == "item3"
 assert len(result[1].childs) == 1
 assert result[1].childs[0].name == "item3.1"
+# check we've copied every attr
+assert result[0].href == "h1"
+assert result[0].icon == "i1"
+assert result[0].label == "l1"
+assert result[0].baseview == "b1"
+assert result[0].cond == "c1"
 
 @patch.object(EmptyAuthManager, "security_manager")
 def test_filter_permitted_menu_items_twice(self, mock_security_manager, 
auth_manager):
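
For context, the change above swaps an explicit field-by-field copy for a dict-unpacking copy. A tiny standalone illustration of that idiom, using a generic class rather than the real MenuItem:

    class Item:
        def __init__(self, name, href=None, childs=None):
            self.name = name
            self.href = href
            self.childs = childs or []


    original = Item(name="item1", href="/home", childs=["child"])

    # Copy all current attributes, overriding only `childs`; attributes added to
    # Item later (like `href` here) are carried over without touching this code.
    copy = Item(**{**original.__dict__, "childs": []})

    assert copy.name == "item1" and copy.href == "/home" and copy.childs == []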



(airflow) branch main updated: Copy menu_item href for nav bar (#39282)

2024-04-27 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 25f901a963 Copy menu_item href for nav bar (#39282)
25f901a963 is described below

commit 25f901a963001377621abe0ac0a1ff121a042bcd
Author: Brent Bovenzi 
AuthorDate: Sat Apr 27 12:21:24 2024 -0400

Copy menu_item href for nav bar (#39282)

Co-authored-by: Jed Cunningham 
---
 airflow/auth/managers/base_auth_manager.py| 10 --
 tests/auth/managers/test_base_auth_manager.py | 16 +++-
 2 files changed, 19 insertions(+), 7 deletions(-)

diff --git a/airflow/auth/managers/base_auth_manager.py 
b/airflow/auth/managers/base_auth_manager.py
index 44fc53a66e..86f0ebd6dc 100644
--- a/airflow/auth/managers/base_auth_manager.py
+++ b/airflow/auth/managers/base_auth_manager.py
@@ -398,12 +398,10 @@ class BaseAuthManager(LoggingMixin):
 accessible_items = []
 for menu_item in items:
 menu_item_copy = MenuItem(
-name=menu_item.name,
-icon=menu_item.icon,
-label=menu_item.label,
-childs=[],
-baseview=menu_item.baseview,
-cond=menu_item.cond,
+**{
+**menu_item.__dict__,
+"childs": [],
+}
 )
 if menu_item.childs:
 accessible_children = []
diff --git a/tests/auth/managers/test_base_auth_manager.py 
b/tests/auth/managers/test_base_auth_manager.py
index a39b60787c..7628924ad6 100644
--- a/tests/auth/managers/test_base_auth_manager.py
+++ b/tests/auth/managers/test_base_auth_manager.py
@@ -300,7 +300,15 @@ class TestBaseAuthManager:
 mock_security_manager.has_access.side_effect = [True, False, True, 
True, False]
 
 menu = Menu()
-menu.add_link("item1")
+menu.add_link(
+# These may not all be valid types, but it does let us check each 
attr is copied
+name="item1",
+href="h1",
+icon="i1",
+label="l1",
+baseview="b1",
+cond="c1",
+)
 menu.add_link("item2")
 menu.add_link("item3")
 menu.add_link("item3.1", category="item3")
@@ -313,6 +321,12 @@ class TestBaseAuthManager:
 assert result[1].name == "item3"
 assert len(result[1].childs) == 1
 assert result[1].childs[0].name == "item3.1"
+# check we've copied every attr
+assert result[0].href == "h1"
+assert result[0].icon == "i1"
+assert result[0].label == "l1"
+assert result[0].baseview == "b1"
+assert result[0].cond == "c1"
 
 @patch.object(EmptyAuthManager, "security_manager")
 def test_filter_permitted_menu_items_twice(self, mock_security_manager, 
auth_manager):



Re: [PR] Copy menu_item href for nav bar [airflow]

2024-04-27 Thread via GitHub


jedcunningham merged PR #39282:
URL: https://github.com/apache/airflow/pull/39282





(airflow) branch main updated (c946fc3f0b -> 04caa6eceb)

2024-04-27 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from c946fc3f0b `TriggerDagRunOperator` deprecate `execution_date` in favor of `logical_date` (#39285)
 add 04caa6eceb Update docstring `LivyOperator` retry_args and deferrable 
docs (#39266)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/apache/livy/operators/livy.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



Re: [PR] Update docstring `LivyOperator` retry_args and deferrable docs [airflow]

2024-04-27 Thread via GitHub


boring-cyborg[bot] commented on PR #39266:
URL: https://github.com/apache/airflow/pull/39266#issuecomment-2081003609

   Awesome work, congrats on your first merged pull request! You are invited to 
check our [Issue Tracker](https://github.com/apache/airflow/issues) for 
additional contributions.
   





Re: [PR] Update docstring `LivyOperator` retry_args and deferrable docs [airflow]

2024-04-27 Thread via GitHub


eladkal merged PR #39266:
URL: https://github.com/apache/airflow/pull/39266





Re: [PR] Expose AWS IAM missing param in Hashicorp secret [airflow]

2024-04-27 Thread via GitHub


ReadytoRocc commented on code in PR #38536:
URL: https://github.com/apache/airflow/pull/38536#discussion_r1581853068


##
airflow/providers/hashicorp/_internal_client/vault_client.py:
##
@@ -318,15 +321,36 @@ def _auth_azure(self, _client: hvac.Client) -> None:
 )
 
 def _auth_aws_iam(self, _client: hvac.Client) -> None:
-if self.auth_mount_point:
-_client.auth.aws.iam_login(
-access_key=self.key_id,
-secret_key=self.secret_id,
-role=self.role_id,
-mount_point=self.auth_mount_point,
-)
+if self.key_id and self.secret_id:
+auth_args = {
+"access_key": self.key_id,
+"secret_key": self.secret_id,
+"role": self.role_id,
+}
 else:
-_client.auth.aws.iam_login(access_key=self.key_id, 
secret_key=self.secret_id, role=self.role_id)
+import boto3
+
+if self.role_arn:
+sts_client = boto3.client("sts")
+credentials = sts_client.assume_role(RoleArn=self.role_arn, 
RoleSessionName="airflow")

Review Comment:
   Agreed. I am saying you no longer need a dedicated `role_arn` backend kwarg. 
It would be a part of `assume_role_kwargs`.
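   
   For illustration, a hedged sketch of that suggestion (the `assume_role_kwargs` name is the proposal here; the keys follow boto3's `assume_role` API):
   
    import boto3

    # Everything role-related lives in one dict that is passed through untouched.
    assume_role_kwargs = {
        "RoleArn": "arn:aws:iam::123456789000:role/hashicorp-aws-iam-role",
        "RoleSessionName": "airflow",
    }

    sts_client = boto3.client("sts")
    credentials = sts_client.assume_role(**assume_role_kwargs)["Credentials"]
    # credentials["AccessKeyId"], credentials["SecretAccessKey"] and
    # credentials["SessionToken"] would then feed hvac's aws iam_login, as above.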






(airflow) branch main updated: `TriggerDagRunOperator` deprecate `execution_date` in favor of `logical_date` (#39285)

2024-04-27 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new c946fc3f0b `TriggerDagRunOperator` deprecate `execution_date` in favor of `logical_date` (#39285)
c946fc3f0b is described below

commit c946fc3f0b4c55bd6fbf9a49950d6e24980b4abe
Author: Felipe Lolas 
AuthorDate: Sat Apr 27 11:46:54 2024 -0400

`TriggerDagRunOperator` deprecate `execution_date` in favor of `logical_date` (#39285)

* added logical_date parameter

* fix comment
---
 airflow/operators/trigger_dagrun.py|  62 --
 tests/operators/test_trigger_dagrun.py | 148 ++---
 2 files changed, 120 insertions(+), 90 deletions(-)

diff --git a/airflow/operators/trigger_dagrun.py 
b/airflow/operators/trigger_dagrun.py
index ab74d4c862..f8cfa5256a 100644
--- a/airflow/operators/trigger_dagrun.py
+++ b/airflow/operators/trigger_dagrun.py
@@ -20,6 +20,7 @@ from __future__ import annotations
 import datetime
 import json
 import time
+import warnings
 from typing import TYPE_CHECKING, Any, Sequence, cast
 
 from sqlalchemy import select
@@ -27,7 +28,7 @@ from sqlalchemy.orm.exc import NoResultFound
 
 from airflow.api.common.trigger_dag import trigger_dag
 from airflow.configuration import conf
-from airflow.exceptions import AirflowException, DagNotFound, 
DagRunAlreadyExists
+from airflow.exceptions import AirflowException, DagNotFound, 
DagRunAlreadyExists, RemovedInAirflow3Warning
 from airflow.models.baseoperator import BaseOperator
 from airflow.models.baseoperatorlink import BaseOperatorLink
 from airflow.models.dag import DagModel
@@ -41,7 +42,7 @@ from airflow.utils.session import provide_session
 from airflow.utils.state import DagRunState
 from airflow.utils.types import DagRunType
 
-XCOM_EXECUTION_DATE_ISO = "trigger_execution_date_iso"
+XCOM_LOGICAL_DATE_ISO = "trigger_logical_date_iso"
 XCOM_RUN_ID = "trigger_run_id"
 
 
@@ -64,7 +65,7 @@ class TriggerDagRunLink(BaseOperatorLink):
 def get_link(self, operator: BaseOperator, *, ti_key: TaskInstanceKey) -> 
str:
 # Fetch the correct execution date for the triggerED dag which is
 # stored in xcom during execution of the triggerING task.
-when = XCom.get_value(ti_key=ti_key, key=XCOM_EXECUTION_DATE_ISO)
+when = XCom.get_value(ti_key=ti_key, key=XCOM_LOGICAL_DATE_ISO)
 query = {"dag_id": cast(TriggerDagRunOperator, 
operator).trigger_dag_id, "base_date": when}
 return build_airflow_url_with_query(query)
 
@@ -77,7 +78,7 @@ class TriggerDagRunOperator(BaseOperator):
 :param trigger_run_id: The run ID to use for the triggered DAG run 
(templated).
 If not provided, a run ID will be automatically generated.
 :param conf: Configuration for the DAG run (templated).
-:param execution_date: Execution date for the dag (templated).
+:param logical_date: Logical date for the dag (templated).
 :param reset_dag_run: Whether clear existing dag run if already exists.
 This is useful when backfill or rerun an existing dag run.
 This only resets (not recreates) the dag run.
@@ -91,12 +92,13 @@ class TriggerDagRunOperator(BaseOperator):
 :param failed_states: List of failed or dis-allowed states, default is 
``None``.
 :param deferrable: If waiting for completion, whether or not to defer the 
task until done,
 default is ``False``.
+:param execution_date: Deprecated parameter; same as ``logical_date``.
 """
 
 template_fields: Sequence[str] = (
 "trigger_dag_id",
 "trigger_run_id",
-"execution_date",
+"logical_date",
 "conf",
 "wait_for_completion",
 )
@@ -110,13 +112,14 @@ class TriggerDagRunOperator(BaseOperator):
 trigger_dag_id: str,
 trigger_run_id: str | None = None,
 conf: dict | None = None,
-execution_date: str | datetime.datetime | None = None,
+logical_date: str | datetime.datetime | None = None,
 reset_dag_run: bool = False,
 wait_for_completion: bool = False,
 poke_interval: int = 60,
 allowed_states: list[str] | None = None,
 failed_states: list[str] | None = None,
 deferrable: bool = conf.getboolean("operators", "default_deferrable", 
fallback=False),
+execution_date: str | datetime.datetime | None = None,
 **kwargs,
 ) -> None:
 super().__init__(**kwargs)
@@ -136,20 +139,29 @@ class TriggerDagRunOperator(BaseOperator):
 self.failed_states = [DagRunState.FAILED]
 self._defer = deferrable
 
-if execution_date is not None and not isinstance(execution_date, (str, 
datetime.datetime)):
+if execution_date is not None:
+warnings.warn(
+"Parameter 'execution_date' is deprecated. Use 'logical_date' 

Re: [I] Align TriggerDagRunOperator with logical date terminology [airflow]

2024-04-27 Thread via GitHub


eladkal closed issue #26916: Align TriggerDagRunOperator with logical date 
terminology
URL: https://github.com/apache/airflow/issues/26916





Re: [PR] added logical_date parameter [airflow]

2024-04-27 Thread via GitHub


eladkal merged PR #39285:
URL: https://github.com/apache/airflow/pull/39285





Re: [I] Bug in Extras in Add Connection: while writing, a few words are hidden due to margin and padding [airflow]

2024-04-27 Thread via GitHub


pateash commented on issue #37583:
URL: https://github.com/apache/airflow/issues/37583#issuecomment-2080822317

   @Ram-tripath could you please share some screenshots of the issue you are 
facing?





(airflow) branch v2-9-test updated: Move significant note for past release to release notes (#39283)

2024-04-27 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-9-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/v2-9-test by this push:
 new 368ff13c13 Move significant note for past release to release notes 
(#39283)
368ff13c13 is described below

commit 368ff13c13d33854e0da8940f1b9d77ff21c9a9c
Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com>
AuthorDate: Sat Apr 27 09:56:48 2024 -0400

Move significant note for past release to release notes (#39283)

Once the release is out, these should be added directly to the release
notes - newsfragments are all about future releases!

(cherry picked from commit 8dfdc3a0a8aa4a12f0e0e3f6e0a6ff925646c201)
---
 RELEASE_NOTES.rst   | 8 
 airflow/reproducible_build.yaml | 4 ++--
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst
index 5533fad32c..1f323919ea 100644
--- a/RELEASE_NOTES.rst
+++ b/RELEASE_NOTES.rst
@@ -111,6 +111,14 @@ Xcom table column ``value`` type has changed from ``blob`` 
to ``longblob``. This
 
 To downgrade from revision: ``b4078ac230a1``, ensure that you don't have Xcom 
values larger than 65,535 bytes. Otherwise, you'll need to clean those rows or 
run ``airflow db clean xcom`` to clean the Xcom table.
 
+Stronger validation for key parameter defaults in taskflow context variables 
(#38015)
+"
+
+Because the taskflow implementation, in conjunction with context variable defaults, could generate invalid parameter orders, it is no longer accepted (and is now validated) that taskflow functions are defined with defaults other than ``None``. If you have done this before, you will most likely see a broken DAG and an error message like ``Error message: Context key parameter my_param can't have a default other than None``.
+
 New Features
 
 - Allow users to write dag_id and task_id in their national characters, added 
display name for dag / task (v2) (#38446)
diff --git a/airflow/reproducible_build.yaml b/airflow/reproducible_build.yaml
index fc6a3933e8..c6683aa2c0 100644
--- a/airflow/reproducible_build.yaml
+++ b/airflow/reproducible_build.yaml
@@ -1,2 +1,2 @@
-release-notes-hash: 416d01241f2b6ed259e8d991fb7ac1f8
-source-date-epoch: 1712672348
+release-notes-hash: aad86522e49984ce17db1b8647cfb54a
+source-date-epoch: 1714165337



(airflow) branch main updated (60b17bb78a -> 8dfdc3a0a8)

2024-04-27 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 60b17bb78a Improve helm chart annotation building script (#39286)
 add 8dfdc3a0a8 Move significant note for past release to release notes 
(#39283)

No new revisions were added by this update.

Summary of changes:
 RELEASE_NOTES.rst   | 8 
 airflow/reproducible_build.yaml | 4 ++--
 newsfragments/38015.significant.rst | 6 --
 3 files changed, 10 insertions(+), 8 deletions(-)
 delete mode 100644 newsfragments/38015.significant.rst



Re: [PR] Move significant newsfragment for 2.9.0 to release notes [airflow]

2024-04-27 Thread via GitHub


jedcunningham merged PR #39283:
URL: https://github.com/apache/airflow/pull/39283





(airflow) branch main updated (48c98bc1d4 -> 60b17bb78a)

2024-04-27 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 48c98bc1d4 Update Hashicorp AWS assume role auth docs (#39287)
 add 60b17bb78a Improve helm chart annotation building script (#39286)

No new revisions were added by this update.

Summary of changes:
 dev/chart/build_changelog_annotations.py | 11 ---
 1 file changed, 8 insertions(+), 3 deletions(-)



Re: [PR] Improve helm chart annotation building script [airflow]

2024-04-27 Thread via GitHub


jedcunningham merged PR #39286:
URL: https://github.com/apache/airflow/pull/39286





Re: [I] KubernetesPodOperator callback example from Doc doesn't work [airflow]

2024-04-27 Thread via GitHub


boring-cyborg[bot] commented on issue #39291:
URL: https://github.com/apache/airflow/issues/39291#issuecomment-2080656538

   Thanks for opening your first issue here! Be sure to follow the issue 
template! If you are willing to raise PR to address this issue please do so, no 
need to wait for approval.
   





[I] KubernetesPodOperator callback example from Doc doesn't work [airflow]

2024-04-27 Thread via GitHub


owler opened a new issue, #39291:
URL: https://github.com/apache/airflow/issues/39291

   ### What do you see as an issue?
   
   
https://airflow.apache.org/docs/apache-airflow-providers-cncf-kubernetes/stable/operators.html#id13
   
   The problem may be related to the None value for api_version and kind; see the log below.
   
   {code}
   import kubernetes.client as k8s
   import kubernetes_asyncio.client as async_k8s
   
   from airflow.providers.cncf.kubernetes.operators.pod import 
KubernetesPodOperator
   from airflow.providers.cncf.kubernetes.callbacks import 
KubernetesPodOperatorCallback
   import pendulum
   from airflow import DAG
   from airflow.operators.empty import EmptyOperator
   
   class MyCallback(KubernetesPodOperatorCallback):
   @staticmethod
   def on_pod_creation(*, pod: k8s.V1Pod, client: k8s.CoreV1Api, mode: str, 
**kwargs) -> None:
   client.create_namespaced_service(
   namespace=pod.metadata.namespace,
   body=k8s.V1Service(
   metadata=k8s.V1ObjectMeta(
   name=pod.metadata.name,
   labels=pod.metadata.labels,
   owner_references=[
   k8s.V1OwnerReference(
   api_version=pod.api_version,
   kind=pod.kind,
   name=pod.metadata.name,
   uid=pod.metadata.uid,
   controller=True,
   block_owner_deletion=True,
   )
   ],
   ),
   spec=k8s.V1ServiceSpec(
   selector=pod.metadata.labels,
   ports=[
   k8s.V1ServicePort(
   name="http",
   port=80,
   target_port=80,
   )
   ],
   ),
   ),
   )
   
   with DAG(
   dag_id='test_dag2',
   schedule="45 * * * *",
   start_date=pendulum.datetime(2024, 3, 26, tz="UTC"),
   catchup=False,
   max_active_runs=3,
   dagrun_timeout=None,
   params={
   "srcPath": "/dimas/test_dataset",
   "partition": "",
   "dstPath": "/shared/dmitry.savenko/kube"
   }
   ) as dag:
   
   k = KubernetesPodOperator(
   task_id="test_callback",
   image="alpine",
   cmds=["/bin/sh"],
   arguments=["-c", "echo hello world; echo Custom error > 
/dev/termination-log; exit 1;"],
   name="test-callback",
   callbacks=MyCallback,
   )
   run_this_last = EmptyOperator(
   task_id="run_this_last",
   )
   
   k >> run_this_last
   {code}
   
   
   {code}
File 
"/home/airflow/.local/lib/python3.8/site-packages/kubernetes/client/models/v1_owner_reference.py",
 line 97, in api_version
   raise ValueError("Invalid value for `api_version`, must not be `None`")  
# noqa: E501
   
   
   airflow.exceptions.AirflowException: Pod test-callback-lvmxwlac returned a 
failure.
   remote_pod: {'api_version': None,
'kind': None,
'metadata': {'annotations': None,
 'creation_timestamp': datetime.datetime(2024, 4, 27, 13, 13, 
25, tzinfo=tzlocal()),
 'deletion_grace_period_seconds': None,
 'deletion_timestamp': None,
 'finalizers': None,
 'generate_name': None,
 'generation': None,
 'labels': {'airflow_kpo_in_cluster': 'True',
'airflow_version': '2.8.4',
'dag_id': 'test_dag2',
'kubernetes_pod_operator': 'True',
'run_id': 
'scheduled__2024-04-27T114500-ac155a96d',
'task_id': 'test_callback',
   {code}
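   
   A possible workaround, shown only as an untested sketch (`k8s` and `pod` as in the callback above), is to default the missing fields before building the owner reference:
   
   {code}
   owner_reference = k8s.V1OwnerReference(
       api_version=pod.api_version or "v1",   # V1OwnerReference rejects None
       kind=pod.kind or "Pod",
       name=pod.metadata.name,
       uid=pod.metadata.uid,
       controller=True,
       block_owner_deletion=True,
   )
   {code}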
   
   ### Solving the problem
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





Re: [PR] fix: use `sqlalchemy_url` property in `get_uri` for postgresql provider [airflow]

2024-04-27 Thread via GitHub


rawwar commented on code in PR #38831:
URL: https://github.com/apache/airflow/pull/38831#discussion_r1581814020


##
airflow/providers/postgres/hooks/postgres.py:
##
@@ -113,6 +114,18 @@ def schema(self):
 def schema(self, value):
 self.database = value
 
+@property
+def sqlalchemy_url(self) -> URL:
+conn = self.get_connection(getattr(self, self.conn_name_attr))
+return URL.create(
+drivername="postgresql",
+username=conn.login,
+password=conn.password,
+host=conn.host,
+port=conn.port,
+database=self.database or conn.schema,

Review Comment:
   @Taragolis, I'm wondering whether I should pass a `query` value built from the data in the `extra` field, or whether the current implementation is fine.
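   
   For illustration, one hedged way the extras could be threaded through (whether that is desirable is exactly the open question; `extra_dejson` is the standard connection accessor):
   
    from sqlalchemy.engine import URL


    def build_url(conn, database=None) -> URL:
        # Sketch only: forward connection extras as the SQLAlchemy URL query string,
        # so options such as sslmode end up on the URL.
        return URL.create(
            drivername="postgresql",
            username=conn.login,
            password=conn.password,
            host=conn.host,
            port=conn.port,
            database=database or conn.schema,
            query={k: str(v) for k, v in conn.extra_dejson.items()},
        )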






Re: [PR] fix: use `sqlalchemy_url` property in `get_uri` for postgresql provider [airflow]

2024-04-27 Thread via GitHub


rawwar commented on code in PR #38831:
URL: https://github.com/apache/airflow/pull/38831#discussion_r1581812721


##
airflow/providers/postgres/hooks/postgres.py:
##
@@ -113,6 +114,18 @@ def schema(self):
 def schema(self, value):
 self.database = value
 
+@property
+def sqlalchemy_url(self) -> URL:
+conn = self.get_connection(getattr(self, self.conn_name_attr))
+return URL.create(
+drivername="postgresql",

Review Comment:
   Since different dialects require the specific driver package to be installed, for now I'm defaulting to SQLAlchemy's default driver.






Re: [I] Create a Cloud Storage Operator that could return a list of objects in a folder [airflow]

2024-04-27 Thread via GitHub


lopezvit commented on issue #39290:
URL: https://github.com/apache/airflow/issues/39290#issuecomment-2080457087

   Ok, after a bit of googling I found that it exists!!!
   
[GCSListObjectsOperator](https://airflow.apache.org/docs/apache-airflow-providers-google/stable/_api/airflow/providers/google/cloud/operators/gcs/index.html#airflow.providers.google.cloud.operators.gcs.GCSListObjectsOperator)
   Then, what I believe is wrong is the documentation, because I didn't find it here:
   
https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/gcs.html
   Should I create a new issue for fixing the documentation or can it continue 
here?
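   
   For anyone landing here, a short usage sketch of the operator linked above (bucket and prefix values are placeholders):
   
    from airflow.providers.google.cloud.operators.gcs import GCSListObjectsOperator

    list_csv_files = GCSListObjectsOperator(
        task_id="list_csv_files",
        bucket="my-bucket",    # placeholder bucket name
        prefix="incoming/",    # list only objects under this prefix
    )
    # The matching object names are returned and pushed to XCom.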





[I] Create a Cloud Storage Operator that could return a list of objects in a folder [airflow]

2024-04-27 Thread via GitHub


lopezvit opened a new issue, #39290:
URL: https://github.com/apache/airflow/issues/39290

   ### Description
   
   Create a new operator inside of `airflow.providers.google.cloud.operators.gcs` that, given a pattern/prefix, would return a list of files in said folder, similar to the client method:
   
   # assumes a client created with `storage_client = google.cloud.storage.Client()`
   bucket = storage_client.get_bucket(BUCKET_NAME)
   blobs = bucket.list_blobs(prefix=filename)
   
   ### Use case/motivation
   
   _No response_
   
   ### Related issues
   
   I have a process that runs once a day that reads some *.csv files from storage and processes them.
   It would be nice to have an operator that would do exactly that, without 
needing to create custom code for it.
   When I create the custom code, probably using the storage hook, I can try to 
paste it here, but I don't have time to create a PR.
   
   ### Are you willing to submit a PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





Re: [I] [Bug]: Papermill Provider installed via pip seems empty [airflow]

2024-04-27 Thread via GitHub


deramos commented on issue #39281:
URL: https://github.com/apache/airflow/issues/39281#issuecomment-2080436529

   @Taragolis, the message I posted earlier shows that `papermill` doesn't exist in the airflow.providers namespace even after pip install. I think you may have missed it.





Re: [I] Rendering custom map index before task is run. [airflow]

2024-04-27 Thread via GitHub


tomrutter commented on issue #39118:
URL: https://github.com/apache/airflow/issues/39118#issuecomment-2080433343

   Happy to be assigned, but if someone has a clear idea, I'm happy for that to go ahead too.
   
   I'm guessing the big issue with rendering on the webserver is picking up the context added by the task. I'd start with that for now, though, with failover to the rendered value from the task if it exists.





Re: [I] KubernetesPodOperator duplicating logs when interrupted [airflow]

2024-04-27 Thread via GitHub


gbonazzoli commented on issue #39236:
URL: https://github.com/apache/airflow/issues/39236#issuecomment-2080427612

   @raphaelauv 
   
   With version 8.1.1 the problem is still present. It seems that now it always gets "_Pod docker-java-w2ade41b log read interrupted but container base still running_".
   
   Airflow's version:
   
   ```bash
   airflow@airflow-test-worker-6cb8744f69-sw7xg:/opt/airflow$ airflow version
   2.9.0
   
   airflow@airflow-test-worker-6cb8744f69-sw7xg:/opt/airflow$ pip list | grep 
kub
   apache-airflow-providers-cncf-kubernetes 8.1.1
   kubernetes   29.0.0
   kubernetes_asyncio   29.0.0
   ```





Re: [PR] Warn users when viewing docs for an older version of Airflow [airflow-site]

2024-04-27 Thread via GitHub


eladkal commented on PR #872:
URL: https://github.com/apache/airflow-site/pull/872#issuecomment-2080412865

   @RNHTTR are you still working on this?





(airflow) branch main updated: Update Hashicorp AWS assume role auth docs (#39287)

2024-04-27 Thread pankaj
This is an automated email from the ASF dual-hosted git repository.

pankaj pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 48c98bc1d4 Update Hashicorp AWS assume role auth docs (#39287)
48c98bc1d4 is described below

commit 48c98bc1d4371a990bfadbfdd1478254e0098fa6
Author: Pankaj Singh <98807258+pankajas...@users.noreply.github.com>
AuthorDate: Sat Apr 27 11:51:04 2024 +0530

Update Hashicorp AWS assume role auth docs (#39287)
---
 .../secrets-backends/hashicorp-vault.rst   | 7 ---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git 
a/docs/apache-airflow-providers-hashicorp/secrets-backends/hashicorp-vault.rst 
b/docs/apache-airflow-providers-hashicorp/secrets-backends/hashicorp-vault.rst
index 3227b0ef58..6d0a5393b1 100644
--- 
a/docs/apache-airflow-providers-hashicorp/secrets-backends/hashicorp-vault.rst
+++ 
b/docs/apache-airflow-providers-hashicorp/secrets-backends/hashicorp-vault.rst
@@ -220,14 +220,15 @@ Add "verify": "absolute path to ca-certificate file"
 Vault authentication with AWS Assume Role STS
 "
 
-Add parameter "role_arn": "The AWS ARN of the role to assume"
+Add parameter "assume_role_kwargs": "The AWS STS assume role auth parameter 
dict"
+
+For more details, please refer to the AWS Assume Role Authentication 
documentation: 
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts/client/assume_role.html
 
 .. code-block:: ini
 
 [secrets]
 backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
-backend_kwargs = {"connections_path": "airflow-connections", "variables_path": null, "mount_point": "airflow", "url": "http://127.0.0.1:8200", "auth_type": "aws_iam", "role_arn": "arn:aws:iam::123456789000:role/hashicorp-aws-iam-role"}
-
+backend_kwargs = {"connections_path": "airflow-connections", "variables_path": null, "mount_point": "airflow", "url": "http://127.0.0.1:8200", "auth_type": "aws_iam", "assume_role_kwargs": {"arn:aws:iam::123456789000:role/hashicorp-aws-iam-role", "RoleSessionName": "Airflow"}}
 
 Using multiple mount points
 """



Re: [PR] Update Hashicorp AWS assume role auth docs [airflow]

2024-04-27 Thread via GitHub


pankajastro merged PR #39287:
URL: https://github.com/apache/airflow/pull/39287





Re: [PR] Updates to Teradata Provider [airflow]

2024-04-27 Thread via GitHub


eladkal commented on code in PR #39217:
URL: https://github.com/apache/airflow/pull/39217#discussion_r1581740750


##
airflow/providers/teradata/transfers/azure_blob_to_teradata.py:
##
@@ -0,0 +1,105 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Sequence
+
+from airflow.models import BaseOperator
+from airflow.providers.microsoft.azure.hooks.wasb import WasbHook
+from airflow.providers.teradata.hooks.teradata import TeradataHook
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class AzureBlobStorageToTeradataOperator(BaseOperator):
+"""
+
+Loads CSV, JSON and Parquet format data from Azure Blob Storage to 
Teradata.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:AzureBlobStorageToTeradataOperator`
+
+:param blob_source_key: The URI format specifying the location of the 
Azure blob object store.(templated)
+The URI format is 
`/az/YOUR-STORAGE-ACCOUNT.blob.core.windows.net/YOUR-CONTAINER/YOUR-BLOB-LOCATION`.
+Refer to
+
https://docs.teradata.com/search/documents?query=native+object+store=last_update=title_only=en-US
+:param azure_conn_id: The Airflow WASB connection used for azure blob 
credentials.
+:param teradata_table: The name of the teradata table to which the data is 
transferred.(templated)
+:param teradata_conn_id: The connection ID used to connect to Teradata
+:ref:`Teradata connection `
+
+Note that ``blob_source_key`` and ``teradata_table`` are
+templated, so you can use variables in them if you wish.
+"""
+
+template_fields: Sequence[str] = ("blob_source_key", "teradata_table")
+ui_color = "#e07c24"
+
+def __init__(
+self,
+*,
+blob_source_key: str,
+azure_conn_id: str = "azure_default",
+teradata_table: str,
+teradata_conn_id: str = "teradata_default",
+**kwargs,
+) -> None:
+super().__init__(**kwargs)
+self.blob_source_key = blob_source_key
+self.azure_conn_id = azure_conn_id
+self.teradata_table = teradata_table
+self.teradata_conn_id = teradata_conn_id
+
+def execute(self, context: Context) -> None:
+self.log.info(
+"transferring data from %s to teradata table %s...", 
self.blob_source_key, self.teradata_table
+)
+azure_hook = WasbHook(wasb_conn_id=self.azure_conn_id)
+conn = azure_hook.get_connection(self.azure_conn_id)
+# Obtaining the Azure client ID and Azure secret in order to access a 
specified Blob container
+access_id = conn.login if conn.login is not None else ""
+access_secret = conn.password if conn.password is not None else ""
+teradata_hook = TeradataHook(teradata_conn_id=self.teradata_conn_id)
+sql = f"""
+CREATE MULTISET TABLE {self.teradata_table}  AS
+(
+SELECT * FROM (
+LOCATION = '{self.blob_source_key}'
+ACCESS_ID= '{access_id}'
+ACCESS_KEY= '{access_secret}'
+) AS d
+) WITH DATA
+"""
+try:
+teradata_hook.run(sql, True)
+except Exception as ex:
+# Handling permission issue errors
+if "Error 3524" in str(ex):
+self.log.error("The user does not have CREATE TABLE access in 
teradata")
+raise
+if "Error 9134" in str(ex):
+self.log.error(
+"There is an issue with the transfer operation. Please 
validate azure and "
+"teradata connection details."
+)
+raise

Review Comment:
   I'm curious. What does the original error message look like?
   Do these messages add more value on top of the original error message?
   I would expect the service to return all the needed information in the exception itself.



##
airflow/providers/teradata/transfers/azure_blob_to_teradata.py:

[PR] Update Hashicorp AWS assume role auth docs [airflow]

2024-04-27 Thread via GitHub


pankajastro opened a new pull request, #39287:
URL: https://github.com/apache/airflow/pull/39287

   Fix: https://github.com/apache/airflow/pull/39279#discussion_r1581736661
   
   
   
   
   
   
   

