[airflow] branch main updated (b60006ae26 -> f89ca94c3e)

2022-11-28 Thread ephraimanierobi
This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from b60006ae26 Add information on running tests with Breeze in PyCharm (#27901)
 add f89ca94c3e Fix deadlock when chaining multiple empty mapped tasks (#27964)

No new revisions were added by this update.

Summary of changes:
 airflow/models/dagrun.py    |  1 +
 tests/models/test_dagrun.py | 49 ++++++++++++++++++++++++++++++++++++++++++++++++-
 2 files changed, 49 insertions(+), 1 deletion(-)



[GitHub] [airflow] ephraimbuddy closed issue #27824: DAG Run fails when chaining multiple empty mapped tasks

2022-11-28 Thread GitBox


ephraimbuddy closed issue #27824: DAG Run fails when chaining multiple empty 
mapped tasks
URL: https://github.com/apache/airflow/issues/27824


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ephraimbuddy merged pull request #27964: Fix deadlock when chaining multiple empty mapped tasks

2022-11-28 Thread GitBox


ephraimbuddy merged PR #27964:
URL: https://github.com/apache/airflow/pull/27964





[airflow] branch main updated: Add information on running tests with Breeze in PyCharm (#27901)

2022-11-28 Thread uranusjr
This is an automated email from the ASF dual-hosted git repository.

uranusjr pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new b60006ae26 Add information on running tests with Breeze in PyCharm (#27901)
b60006ae26 is described below

commit b60006ae26c41e887ec0102bce8b726fce54007d
Author: D. Ferruzzi 
AuthorDate: Mon Nov 28 23:02:16 2022 -0800

Add information on running tests with Breeze in PyCharm (#27901)
---
 TESTING.rst                                    | 39 +++--
 images/{ => pycharm}/configure_test_runner.png | Bin
 images/pycharm/pycharm_add_to_context.png  | Bin 0 -> 199729 bytes
 images/pycharm/pycharm_create_tool.png | Bin 0 -> 60139 bytes
 images/{ => pycharm}/running_unittests.png | Bin
 5 files changed, 37 insertions(+), 2 deletions(-)

diff --git a/TESTING.rst b/TESTING.rst
index 26b95eec7d..d47adc6ad1 100644
--- a/TESTING.rst
+++ b/TESTING.rst
@@ -61,13 +61,13 @@ Running Unit Tests from PyCharm IDE
 To run unit tests from the PyCharm IDE, create the `local virtualenv 
`_,
 select it as the default project's environment, then configure your test 
runner:
 
-.. image:: images/configure_test_runner.png
+.. image:: images/pycharm/configure_test_runner.png
 :align: center
 :alt: Configuring test runner
 
 and run unit tests as follows:
 
-.. image:: images/running_unittests.png
+.. image:: images/pycharm/running_unittests.png
 :align: center
 :alt: Running unit tests
 
@@ -75,6 +75,41 @@ and run unit tests as follows:
 (with no Breeze installed) if they do not have dependencies such as
 Postgres/MySQL/Hadoop/etc.
 
+Running Unit Tests from PyCharm IDE using Breeze
+------------------------------------------------
+
+Ideally, all unit tests should be run using the standardized Breeze 
environment.  While not
+as convenient as the one-click "play button" in PyCharm, the IDE can be 
configured to do
+this in two clicks.
+
+1. Add Breeze as an "External Tool":
+
+   a. From the settings menu, navigate to Tools > External Tools
+   b. Click the little plus symbol to open the "Create Tool" popup and fill it 
out:
+
+.. image:: images/pycharm/pycharm_create_tool.png
+:align: center
+:alt: Creating the Breeze external tool
+
+
+2. Add the tool to the context menu:
+
+   a. From the settings menu, navigate to Appearance & Behavior > Menus & 
Toolbars > Project View Popup Menu
+   b. Click on the list of entries where you would like it to be added.  Right 
above or below "Project View Popup Menu Run Group" may be a good choice; you 
can drag and drop this list to rearrange the placement later as desired.
+   c. Click the little plus at the top of the popup window
+   d. Find your "External Tool" in the new "Choose Actions to Add" popup and 
click OK.  If you followed the image above, it will be at External Tools > 
External Tools > Breeze
+
+**Note:** This only adds the option to that one menu.  If you would like to 
add it to the context menu
+when right-clicking on a tab at the top of the editor, for example, follow the 
steps above again
+and place it in the "Editor Tab Popup Menu".
+
+.. image:: images/pycharm/pycharm_add_to_context.png
+:align: center
+:alt: Adding Breeze to the context menu
+
+3. To run tests in Breeze, right-click on the file or directory in the Project 
View and click Breeze.
+
+
 Running Unit Tests from Visual Studio Code
 --
 
diff --git a/images/configure_test_runner.png 
b/images/pycharm/configure_test_runner.png
similarity index 100%
rename from images/configure_test_runner.png
rename to images/pycharm/configure_test_runner.png
diff --git a/images/pycharm/pycharm_add_to_context.png 
b/images/pycharm/pycharm_add_to_context.png
new file mode 100644
index 00..4372827e60
Binary files /dev/null and b/images/pycharm/pycharm_add_to_context.png differ
diff --git a/images/pycharm/pycharm_create_tool.png 
b/images/pycharm/pycharm_create_tool.png
new file mode 100644
index 00..28470e4fbd
Binary files /dev/null and b/images/pycharm/pycharm_create_tool.png differ
diff --git a/images/running_unittests.png b/images/pycharm/running_unittests.png
similarity index 100%
rename from images/running_unittests.png
rename to images/pycharm/running_unittests.png



[GitHub] [airflow] uranusjr merged pull request #27901: Add information on running tests with Breeze in PyCharm

2022-11-28 Thread GitBox


uranusjr merged PR #27901:
URL: https://github.com/apache/airflow/pull/27901





[GitHub] [airflow] uranusjr commented on a diff in pull request #27797: Add tests to PythonOperator

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27797:
URL: https://github.com/apache/airflow/pull/27797#discussion_r1034362366


##
tests/operators/test_python.py:
##
@@ -1155,6 +1169,21 @@ def f():
 )
 copy.deepcopy(task)
 
+def test_except_value_error(self):
+def f():
+return 1
+
+task = PythonVirtualenvOperator(
+python_callable=f,
+task_id="task",
+dag=self.dag,
+)
+
+task.log.error = unittest.mock.Mock()
+task.pickling_library.loads = 
unittest.mock.Mock(side_effect=ValueError)
+with pytest.raises(ValueError):
+task._read_result(path=unittest.mock.Mock())

Review Comment:
   Would be a good idea to use a custom exception class to ensure the exception 
is actually raised by the mocked pickling lib.
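A minimal sketch of that suggestion (the `FakeUnpickleError` class and the simplified `read_result` helper are illustrative stand-ins, not the actual operator code): raising a bespoke exception class from the mock proves the `pytest.raises` block catches the mocked library's error rather than an unrelated `ValueError` from elsewhere in the code path.

```python
import unittest.mock

import pytest


class FakeUnpickleError(Exception):
    """Sentinel raised only by the mocked pickling library."""


def read_result(pickling_library, path):
    # Simplified stand-in for PythonVirtualenvOperator._read_result:
    # deserialize whatever the virtualenv subprocess wrote to `path`.
    return pickling_library.loads(path.read_bytes())


def test_except_value_error():
    lib = unittest.mock.Mock()
    lib.loads.side_effect = FakeUnpickleError
    fake_path = unittest.mock.Mock(read_bytes=lambda: b"")
    # Catching the sentinel class (not plain ValueError) guarantees the
    # exception really originated from the mocked pickling library.
    with pytest.raises(FakeUnpickleError):
        read_result(lib, fake_path)
```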






[GitHub] [airflow] uranusjr commented on a diff in pull request #27797: Add tests to PythonOperator

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27797:
URL: https://github.com/apache/airflow/pull/27797#discussion_r1034361731


##
tests/operators/test_python.py:
##
@@ -1155,6 +1169,21 @@ def f():
 )
 copy.deepcopy(task)
 
+def test_except_value_error(self):
+def f():
+return 1
+
+task = PythonVirtualenvOperator(
+python_callable=f,
+task_id="task",
+dag=self.dag,
+)
+
+task.log.error = unittest.mock.Mock()

Review Comment:
   Maybe this should use `caplog` to actually capture the logs instead of 
mocking it.
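A hedged sketch of the `caplog` approach (the `run_task` helper and its message are hypothetical, standing in for the operator's error path): `caplog` captures real records through the logging machinery, so the assertion exercises the actual `log.error` call instead of replacing it with a mock.

```python
import logging

import pytest


def run_task(logger):
    # Hypothetical stand-in for the operator code under test: log an
    # error, then re-raise, mirroring the behaviour being asserted on.
    logger.error("Failed to read task result")
    raise ValueError("corrupt result file")


def test_error_is_logged(caplog):
    logger = logging.getLogger("airflow.task.operators")
    with caplog.at_level(logging.ERROR, logger=logger.name):
        with pytest.raises(ValueError):
            run_task(logger)
    # The real record was captured; no need to mock task.log.error.
    assert "Failed to read task result" in caplog.text
```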






[GitHub] [airflow] uranusjr commented on a diff in pull request #27797: Add tests to PythonOperator

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27797:
URL: https://github.com/apache/airflow/pull/27797#discussion_r1034360950


##
tests/operators/test_python.py:
##
@@ -375,6 +375,20 @@ def func():
 "INFO:airflow.task.operators:Done. Returned value not shown" in 
cm.output
 ), "Log message that the option is turned off should be shown"
 
+def test_python_operator_templates_exts(self):
+def func():
+return "test_return_value"
+
+python_operator = PythonOperator(
+task_id="python_operator",
+python_callable=func,
+dag=self.dag,
+show_return_value_in_logs=False,
+templates_exts=['test_ext']
+)
+
+assert python_operator.template_ext == ['test_ext']

Review Comment:
   Why is this test relevant? `template_ext` is a property on BaseOperator, is 
it not?






[GitHub] [airflow] eladkal commented on a diff in pull request #27710: add a new conf to wait past_deps before skipping a task

2022-11-28 Thread GitBox


eladkal commented on code in PR #27710:
URL: https://github.com/apache/airflow/pull/27710#discussion_r1034347521


##
airflow/cli/cli_parser.py:
##
@@ -543,6 +543,11 @@ def string_lower_type(val):
 help="Ignore depends_on_past dependencies (but respect upstream 
dependencies)",
 action="store_true",
 )
+ARG_WAIT_FOR_PAST_DEPENDS_BEFORE_SKIPPING = Arg(
+("-W", "--wait-for-past-depends-before-skipping"),
+help="Wait for past dependencies before skipping the task when 
--ignore-depends-on-past is not set",
+action="store_true",
+)

Review Comment:
   I agree. We should simplify.






[GitHub] [airflow] uranusjr commented on a diff in pull request #27969: Completed D400 for multiple folders

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27969:
URL: https://github.com/apache/airflow/pull/27969#discussion_r1034345312


##
airflow/executors/kubernetes_executor.py:
##
@@ -454,7 +454,7 @@ def __init__(self):
 @provide_session
 def clear_not_launched_queued_tasks(self, session=None) -> None:
 """
-Clear unlaunched tasks that were previously queued.
+Clear launched tasks that were not yet launched, but previously queued.

Review Comment:
   This sentence does not make sense?






[GitHub] [airflow] uranusjr commented on a diff in pull request #27710: add a new conf to wait past_deps before skipping a task

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27710:
URL: https://github.com/apache/airflow/pull/27710#discussion_r1034344907


##
airflow/cli/commands/task_command.py:
##
@@ -227,6 +227,7 @@ def _run_task_by_executor(args, dag, ti):
 pickle_id=pickle_id,
 ignore_all_deps=args.ignore_all_dependencies,
 ignore_depends_on_past=args.ignore_depends_on_past,
+
wait_for_past_depends_before_skipping=args.wait_for_past_depends_before_skipping,

Review Comment:
   And we can merge these two _now_ into an enum; it is both clearer and makes 
more sense (since, as it’s designed now, it is nonsensical to set _both_ flags).






[GitHub] [airflow] uranusjr commented on a diff in pull request #27710: add a new conf to wait past_deps before skipping a task

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27710:
URL: https://github.com/apache/airflow/pull/27710#discussion_r1034343974


##
airflow/cli/cli_parser.py:
##
@@ -543,6 +543,11 @@ def string_lower_type(val):
 help="Ignore depends_on_past dependencies (but respect upstream 
dependencies)",
 action="store_true",
 )
+ARG_WAIT_FOR_PAST_DEPENDS_BEFORE_SKIPPING = Arg(
+("-W", "--wait-for-past-depends-before-skipping"),
+help="Wait for past dependencies before skipping the task when 
--ignore-depends-on-past is not set",
+action="store_true",
+)

Review Comment:
   I wonder if we should merge this and `--ignore-depends-on-past` into one 
flag, say a `--depends-on-past` option that allows three possible values 
`check` (default), `ignore`, and `wait`. This can be done in a subsequent, 
separate PR, but we should start considering the design.
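A sketch of what that three-valued option could look like with `argparse` (the `DependsOnPast` enum name and the CLI wiring are assumptions for illustration; the actual design is still open, as the comment says):

```python
import argparse
from enum import Enum


class DependsOnPast(str, Enum):
    """How to treat depends_on_past; replaces two mutually exclusive flags."""

    CHECK = "check"    # default: respect depends_on_past as today
    IGNORE = "ignore"  # today's --ignore-depends-on-past
    WAIT = "wait"      # today's --wait-for-past-depends-before-skipping

    def __str__(self):
        return self.value


parser = argparse.ArgumentParser(prog="airflow tasks run")
parser.add_argument(
    "--depends-on-past",
    type=DependsOnPast,
    choices=list(DependsOnPast),
    default=DependsOnPast.CHECK,
    help="check (default), ignore, or wait",
)

# A single option makes the nonsensical "ignore AND wait" combination
# unrepresentable on the command line.
args = parser.parse_args(["--depends-on-past", "wait"])
```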






[GitHub] [airflow] bdsoha commented on a diff in pull request #27974: [AIP-51] Add helper to import default executor class

2022-11-28 Thread GitBox


bdsoha commented on code in PR #27974:
URL: https://github.com/apache/airflow/pull/27974#discussion_r1034335803


##
airflow/executors/executor_loader.py:
##
@@ -65,18 +65,23 @@ class ExecutorLoader:
 DEBUG_EXECUTOR: "airflow.executors.debug_executor.DebugExecutor",
 }
 
+@classmethod
+def get_default_executor_name(cls) -> str:
+"""Returns the default executor name from Airflow configuration

Review Comment:
   ```suggestion
   """Return the default executor name from Airflow configuration.
   ```









[GitHub] [airflow] Taragolis commented on a diff in pull request #27970: Replace `unittests` in amazon provider tests by pure `pytest`

2022-11-28 Thread GitBox


Taragolis commented on code in PR #27970:
URL: https://github.com/apache/airflow/pull/27970#discussion_r1034300256


##
tests/providers/amazon/aws/hooks/test_emr_containers.py:
##
@@ -110,7 +109,9 @@ def test_query_status_polling_with_timeout(self, 
mock_session):
 mock_session.return_value = emr_session_mock
 emr_client_mock.describe_job_run.return_value = JOB2_RUN_DESCRIPTION
 
-query_status = 
self.emr_containers.poll_query_status(job_id="job123456", 
max_polling_attempts=2)
+query_status = self.emr_containers.poll_query_status(
+job_id="job123456", max_polling_attempts=2, poll_interval=2
+)

Review Comment:
   This simple change actually speeds up test execution from 30 seconds to 2 
seconds.
   
   _before_
   ```
   ==================== slowest 100 durations ====================
   30.03s call tests/providers/amazon/aws/hooks/test_emr_containers.py::TestEmrContainerHook::test_query_status_polling_with_timeout
   ```
   
   _after_
   ```
   2.01s call tests/providers/amazon/aws/hooks/test_emr_containers.py::TestEmrContainerHook::test_query_status_polling_with_timeout
   ```
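The saving follows directly from the polling arithmetic: total wall time is roughly `max_polling_attempts * poll_interval`, so shrinking the interval shrinks the test. A simplified sketch of such a polling loop (assumed shape for illustration, not the real `EmrContainerHook` code):

```python
import time


def poll_query_status(check_query_status, max_polling_attempts, poll_interval):
    # Simplified polling loop: ask for the status, return on a terminal
    # state, otherwise sleep poll_interval seconds and try again.  Worst
    # case runtime is about max_polling_attempts * poll_interval.
    status = None
    for attempt in range(max_polling_attempts):
        status = check_query_status()
        if status in {"COMPLETED", "FAILED", "CANCELLED"}:
            return status
        if attempt + 1 < max_polling_attempts:
            time.sleep(poll_interval)
    return status
```

With two attempts, an interval of around 15 seconds costs about 30 seconds of wall time, while `poll_interval=2` costs about 2 seconds, matching the durations above.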









[GitHub] [airflow] Taragolis commented on a diff in pull request #27970: Replace `unittests` in amazon provider tests by pure `pytest`

2022-11-28 Thread GitBox


Taragolis commented on code in PR #27970:
URL: https://github.com/apache/airflow/pull/27970#discussion_r1034285775


##
tests/providers/amazon/aws/operators/test_eks.py:
##
@@ -243,80 +233,74 @@ def 
test_fargate_compute_missing_fargate_pod_execution_role_arn(self):
 missing_fargate_pod_execution_role_arn.execute({})
 
 
-class TestEksCreateFargateProfileOperator(unittest.TestCase):
-def setUp(self) -> None:
-self.create_fargate_profile_params: CreateFargateProfileParams = dict( 
 # type: ignore
+class TestEksCreateFargateProfileOperator:

Review Comment:
   Replace `TestCase.subTest` with parametrized tests.
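For illustration, a `subTest`-style loop over sensor statuses might become something like this under `pytest.mark.parametrize` (the predicate below is a toy stand-in, not the real sensor's `poke`): each status becomes its own reported test case with its own id and failure output.

```python
import pytest


@pytest.mark.parametrize(
    "query_status, expected",
    [
        ("PENDING", False),
        ("RUNNING", False),
        ("COMPLETED", True),
    ],
)
def test_poke(query_status, expected):
    # Toy predicate standing in for sensor.poke(); parametrize runs and
    # reports each (status, expected) pair as a separate test.
    assert (query_status == "COMPLETED") is expected
```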



##
tests/providers/amazon/aws/operators/test_eks.py:
##
@@ -92,35 +91,21 @@ class CreateNodegroupParams(TypedDict):
 nodegroup_role_arn: str
 
 
-class TestEksCreateClusterOperator(unittest.TestCase):
-def setUp(self) -> None:
+class TestEksCreateClusterOperator:

Review Comment:
   1. Replace `TestCase.subTest` with parametrized tests
   2. Rename method `nodegroup_setUp` to `nodegroup_setup`
   3. Rename method `fargate_profile_setUp` to `fargate_profile_setup`



##
tests/providers/amazon/aws/operators/test_s3_file_transform.py:
##
@@ -34,37 +29,33 @@
 from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator
 
 
-class TestS3FileTransformOperator(unittest.TestCase):
-def setUp(self):
+@pytest.fixture
+def transform_script_loc(request, tmp_path_factory):
+transform_script = tmp_path_factory.mktemp(request.node.name) / 
"transform.py"
+transform_script.touch()
+yield str(transform_script)

Review Comment:
   Create the empty "transform" file with the `tmp_path_factory` fixture 
instead of manually creating a temporary directory and cleaning it up.
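A sketch of that fixture shape (the `make_transform_script` helper is added here for illustration): `tmp_path_factory` hands out a unique directory per test and pytest prunes old temporary directories itself, so no `setUp`/`tearDown` bookkeeping is needed.

```python
import pathlib

import pytest


def make_transform_script(directory: pathlib.Path) -> str:
    # Create the empty "transform" script that the operator test points at.
    script = directory / "transform.py"
    script.touch()
    return str(script)


@pytest.fixture
def transform_script_loc(request, tmp_path_factory):
    # One fresh base directory per test; pytest removes old tmp dirs
    # automatically, replacing the manual setUp/tearDown bookkeeping.
    return make_transform_script(tmp_path_factory.mktemp(request.node.name))
```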



##
tests/providers/amazon/aws/operators/test_s3_object.py:
##
@@ -18,10 +18,10 @@
 from __future__ import annotations

Review Comment:
   This test is taken entirely from https://github.com/apache/airflow/pull/27858



##
tests/providers/amazon/aws/operators/test_sagemaker_training.py:
##
@@ -57,11 +57,12 @@
 }
 
 
-class TestSageMakerTrainingOperator(unittest.TestCase):
-def setUp(self):
+class TestSageMakerTrainingOperator:
+def setup_method(self):
+self.create_training_params = copy.deepcopy(CREATE_TRAINING_PARAMS)
 self.sagemaker = SageMakerTrainingOperator(
 task_id="test_sagemaker_operator",
-config=CREATE_TRAINING_PARAMS,
+config=self.create_training_params,
 wait_for_completion=False,
 check_interval=5,
 )

Review Comment:
   After migrating to pytest, some tests failed due to the mutability of the 
test parameters, so these parameters are recreated for each test case.
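The fix can be illustrated in isolation (the nested config below is a made-up miniature of `CREATE_TRAINING_PARAMS`): with a shared module-level dict, an operator that mutates its config in one test leaks that mutation into the next test; a `copy.deepcopy` in `setup_method` restores isolation.

```python
import copy

# Miniature stand-in for the module-level CREATE_TRAINING_PARAMS template.
CREATE_TRAINING_PARAMS = {"ResourceConfig": {"InstanceCount": "1"}}


class TestSageMakerTrainingOperator:
    def setup_method(self):
        # Fresh copy per test: in-place mutations (e.g. the operator
        # casting "1" to 1) can no longer leak between test cases.
        self.create_training_params = copy.deepcopy(CREATE_TRAINING_PARAMS)


first = TestSageMakerTrainingOperator()
first.setup_method()
first.create_training_params["ResourceConfig"]["InstanceCount"] = 1  # mutate

second = TestSageMakerTrainingOperator()
second.setup_method()  # unaffected by the mutation above
```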



##
tests/providers/amazon/aws/operators/test_sagemaker_transform.py:
##
@@ -50,15 +50,16 @@
 "ExecutionRoleArn": "arn:aws:iam:role/test-role",
 }
 
-CONFIG: dict = {"Model": CREATE_MODEL_PARAMS, "Transform": 
CREATE_TRANSFORM_PARAMS}
 
-
-class TestSageMakerTransformOperator(unittest.TestCase):
-def setUp(self):
+class TestSageMakerTransformOperator:
+def setup_method(self):
+self.create_transform_params = copy.deepcopy(CREATE_TRANSFORM_PARAMS)
+self.create_model_params = copy.deepcopy(CREATE_MODEL_PARAMS)
+self.config = {"Model": self.create_model_params, "Transform": 
self.create_transform_params}

Review Comment:
   After migrating to `pytest`, some tests failed due to the mutability of the 
test parameters, so these parameters are recreated for each test case.



##
tests/providers/amazon/aws/sensors/test_emr_containers.py:
##
@@ -41,36 +40,18 @@ def setUp(self):
 # avoids an Airflow warning about connection cannot be found.
 self.sensor.hook.get_connection = lambda _: None
 
-@mock.patch.object(EmrContainerHook, "check_query_status", 
side_effect=("PENDING",))
-def test_poke_pending(self, mock_check_query_status):
-assert not self.sensor.poke(None)
-
-@mock.patch.object(EmrContainerHook, "check_query_status", 
side_effect=("SUBMITTED",))
-def test_poke_submitted(self, mock_check_query_status):
-assert not self.sensor.poke(None)
-
-@mock.patch.object(EmrContainerHook, "check_query_status", 
side_effect=("RUNNING",))
-def test_poke_running(self, mock_check_query_status):
-assert not self.sensor.poke(None)
-
-@mock.patch.object(EmrContainerHook, "check_query_status", 
side_effect=("COMPLETED",))
-def test_poke_completed(self, mock_check_query_status):
-assert self.sensor.poke(None)
-
-@mock.patch.object(EmrContainerHook, "check_query_status", 
side_effect=("FAILED",))
-def test_poke_failed(self, mock_check_query_status):
-with pytest.raises(AirflowException) as ctx:
-self.sensor.poke(None)
-assert "EMR Containers sensor failed" in str(ctx.value)
-
-@mock.patch.object(EmrContainerHook, "check_query_status", 

[GitHub] [airflow] Bowrna commented on a diff in pull request #27905: listener plugin example added

2022-11-28 Thread GitBox


Bowrna commented on code in PR #27905:
URL: https://github.com/apache/airflow/pull/27905#discussion_r1034275922


##
airflow/example_dags/plugins/event_listener.py:
##
@@ -0,0 +1,77 @@
+from airflow.listeners import hookimpl

Review Comment:
   I am checking this now. 






[GitHub] [airflow] o-nikolas opened a new issue, #27979: Cancel a step created from EmrAddStepsOperator

2022-11-28 Thread GitBox


o-nikolas opened a new issue, #27979:
URL: https://github.com/apache/airflow/issues/27979

   ### Description
   
   Support an EMR operator for cancelling jobs/steps.
   Created from the following Github Discussion: 
https://github.com/apache/airflow/discussions/27959
   
   ### Use case/motivation
   
   _No response_
   
   ### Related issues
   
   #27959
   
   ### Are you willing to submit a PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] o-nikolas closed issue #26511: Retrieve AWS Cloudwatch logs using aws_conn_id connection for GlueOperator

2022-11-28 Thread GitBox


o-nikolas closed issue #26511: Retrieve AWS Cloudwatch logs using aws_conn_id 
connection for GlueOperator
URL: https://github.com/apache/airflow/issues/26511





[GitHub] [airflow] o-nikolas commented on issue #26511: Retrieve AWS Cloudwatch logs using aws_conn_id connection for GlueOperator

2022-11-28 Thread GitBox


o-nikolas commented on issue #26511:
URL: https://github.com/apache/airflow/issues/26511#issuecomment-1330029164

   > @o-nikolas - given there is no response for more than a month, can we 
close this issue for now? 
   > @ChirangaL - feel free to open it again if the solution doesn't work for 
your use case.
   
   Agreed, I will close the issue





[GitHub] [airflow] shubham22 commented on pull request #26269: Added Feature - Glue job continuously log printing in Airflow task logs

2022-11-28 Thread GitBox


shubham22 commented on PR #26269:
URL: https://github.com/apache/airflow/pull/26269#issuecomment-1330028294

   @nikhi-suthar - checking once more, are you still working on this? cc: 
@o-nikolas 





[GitHub] [airflow] shubham22 commented on issue #26511: Retrieve AWS Cloudwatch logs using aws_conn_id connection for GlueOperator

2022-11-28 Thread GitBox


shubham22 commented on issue #26511:
URL: https://github.com/apache/airflow/issues/26511#issuecomment-1330024053

   @o-nikolas - given there is no response for more than a month, can we close 
this issue for now? 
   @ChirangaL - feel free to open it again if the solution doesn't work for 
your use case.





[GitHub] [airflow] uranusjr commented on a diff in pull request #27834: Make sure we can get out of a faulty scheduler state

2022-11-28 Thread GitBox


uranusjr commented on code in PR #27834:
URL: https://github.com/apache/airflow/pull/27834#discussion_r1034231937


##
airflow/models/dagrun.py:
##
@@ -780,8 +780,7 @@ def _expand_mapped_task_if_needed(ti: TI) -> Iterable[TI] | 
None:
 except NotMapped:  # Not a mapped task, nothing needed.
 return None
 if expanded_tis:
-assert expanded_tis[0] is ti
-return expanded_tis[1:]
+return expanded_tis

Review Comment:
   If you prefer to keep the new behaviour (return all instead of only new), 
you need to modify the outer logic (naming, docstrings, and comments) to match.






[GitHub] [airflow] uranusjr commented on pull request #27898: fix: current_state method on TaskInstance doesn't filter by map_index

2022-11-28 Thread GitBox


uranusjr commented on PR #27898:
URL: https://github.com/apache/airflow/pull/27898#issuecomment-1329979545

   Please fix the CI failures.





[GitHub] [airflow] uranusjr commented on pull request #27961: add some missing options for the gcs to bq operator

2022-11-28 Thread GitBox


uranusjr commented on PR #27961:
URL: https://github.com/apache/airflow/pull/27961#issuecomment-1329973707

   CI is failing.





[airflow] branch main updated: allow scroll in triggered dag runs modal (#27965)

2022-11-28 Thread bbovenzi
This is an automated email from the ASF dual-hosted git repository.

bbovenzi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 5e4f4a3556 allow scroll in triggered dag runs modal (#27965)
5e4f4a3556 is described below

commit 5e4f4a3556db5111c2ae36af1716719a8494efc7
Author: Brent Bovenzi 
AuthorDate: Mon Nov 28 19:16:04 2022 -0600

allow scroll in triggered dag runs modal (#27965)

Co-authored-by: Jed Cunningham 
<66968678+jedcunning...@users.noreply.github.com>
---
 airflow/www/static/js/components/Table/Cells.tsx | 10 +-
 1 file changed, 9 insertions(+), 1 deletion(-)

diff --git a/airflow/www/static/js/components/Table/Cells.tsx 
b/airflow/www/static/js/components/Table/Cells.tsx
index 4850d0772e..6c34f5d9b1 100644
--- a/airflow/www/static/js/components/Table/Cells.tsx
+++ b/airflow/www/static/js/components/Table/Cells.tsx
@@ -110,7 +110,14 @@ export const TriggeredRuns = ({ cell: { value, row } }: 
CellProps) => {
   return (
 
   {value.length}
-  
+  
 
 
   
@@ -127,6 +134,7 @@ export const TriggeredRuns = ({ cell: { value, row } }: 
CellProps) => {
 
   
 



[GitHub] [airflow] bbovenzi closed issue #27936: Datasets triggered run modal is not scrollable

2022-11-28 Thread GitBox


bbovenzi closed issue #27936: Datasets triggered run modal is not scrollable
URL: https://github.com/apache/airflow/issues/27936





[GitHub] [airflow] bbovenzi merged pull request #27965: Allow scroll in triggered dag runs modal

2022-11-28 Thread GitBox


bbovenzi merged PR #27965:
URL: https://github.com/apache/airflow/pull/27965





[GitHub] [airflow] jedcunningham commented on a diff in pull request #27419: Add webserverConfigConfigmapName

2022-11-28 Thread GitBox


jedcunningham commented on code in PR #27419:
URL: https://github.com/apache/airflow/pull/27419#discussion_r1034194499


##
chart/values.yaml:
##
@@ -976,6 +978,7 @@ webserver:
 
   #   # Flask-WTF flag for CSRF
   #   CSRF_ENABLED = True
+  webserverConfigConfigmapName: ~

Review Comment:
   ```suggestion
 webserverConfigConfigMapName: ~
   ```
   We should match the casing used by k8s.



##
tests/charts/test_webserver.py:
##
@@ -738,6 +750,45 @@ def test_webserver_config_configmap(self):
 == jmespath.search('data."webserver_config.py"', docs[0]).strip()
 )
 
+def test_webserver_config_configmap_name_volume_mounts(self):
+configmap_name = "my-configmap"
+docs = render_chart(
+values={
+"scheduler": {"logGroomerSidecar": {"enabled": True}, 
"waitForMigrations": {"enabled": True}},
+"triggerer": {"waitForMigrations": {"enabled": True}},
+"webserver": {
+"waitForMigrations": {"enabled": True},
+"webserverConfig": "CSRF_ENABLED = True  # {{ 
.Release.Name }}",
+"webserverConfigConfigmapName": configmap_name,
+},
+"workers": {"kerberosSidecar": {"enabled": True}, 
"persistence": {"enabled": True}},
+},
+show_only=[
+"templates/scheduler/scheduler-deployment.yaml",
+"templates/triggerer/triggerer-deployment.yaml",
+"templates/webserver/webserver-deployment.yaml",
+"templates/workers/worker-deployment.yaml",

Review Comment:
   This test should probably be moved to `test_airflow_common.py` instead.



##
tests/charts/test_webserver.py:
##
@@ -738,6 +750,45 @@ def test_webserver_config_configmap(self):
 == jmespath.search('data."webserver_config.py"', docs[0]).strip()
 )
 
+def test_webserver_config_configmap_name_volume_mounts(self):
+configmap_name = "my-configmap"
+docs = render_chart(
+values={
+"scheduler": {"logGroomerSidecar": {"enabled": True}, 
"waitForMigrations": {"enabled": True}},
+"triggerer": {"waitForMigrations": {"enabled": True}},
+"webserver": {
+"waitForMigrations": {"enabled": True},
+"webserverConfig": "CSRF_ENABLED = True  # {{ 
.Release.Name }}",
+"webserverConfigConfigmapName": configmap_name,
+},
+"workers": {"kerberosSidecar": {"enabled": True}, 
"persistence": {"enabled": True}},

Review Comment:
   ```suggestion
   "workers": {"kerberosSidecar": {"enabled": True}},
   ```
   
   And here.



##
tests/charts/test_webserver.py:
##
@@ -738,6 +750,45 @@ def test_webserver_config_configmap(self):
 == jmespath.search('data."webserver_config.py"', docs[0]).strip()
 )
 
+def test_webserver_config_configmap_name_volume_mounts(self):
+configmap_name = "my-configmap"
+docs = render_chart(
+values={
+"scheduler": {"logGroomerSidecar": {"enabled": True}, 
"waitForMigrations": {"enabled": True}},
+"triggerer": {"waitForMigrations": {"enabled": True}},
+"webserver": {
+"waitForMigrations": {"enabled": True},

Review Comment:
   ```suggestion
   ```
   
   Same here.



##
tests/charts/test_webserver.py:
##
@@ -738,6 +750,45 @@ def test_webserver_config_configmap(self):
 == jmespath.search('data."webserver_config.py"', docs[0]).strip()
 )
 
+def test_webserver_config_configmap_name_volume_mounts(self):
+configmap_name = "my-configmap"
+docs = render_chart(
+values={
+"scheduler": {"logGroomerSidecar": {"enabled": True}, 
"waitForMigrations": {"enabled": True}},
+"triggerer": {"waitForMigrations": {"enabled": True}},

Review Comment:
   ```suggestion
   ```
   
   We don't need to set these defaults.






[GitHub] [airflow] park-peter opened a new issue, #27978: KeyError: 0 error with common-sql version 1.3.0

2022-11-28 Thread GitBox


park-peter opened a new issue, #27978:
URL: https://github.com/apache/airflow/issues/27978

   ### Apache Airflow Provider(s)
   
   common-sql
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon==6.0.0
   apache-airflow-providers-apache-hive==4.0.1
   apache-airflow-providers-apache-livy==3.1.0
   apache-airflow-providers-celery==3.0.0
   apache-airflow-providers-cncf-kubernetes==4.4.0
   apache-airflow-providers-common-sql==1.3.0
   apache-airflow-providers-databricks==3.3.0
   apache-airflow-providers-dbt-cloud==2.2.0
   apache-airflow-providers-elasticsearch==4.2.1
   apache-airflow-providers-ftp==3.1.0
   apache-airflow-providers-google==8.4.0
   apache-airflow-providers-http==4.0.0
   apache-airflow-providers-imap==3.0.0
   apache-airflow-providers-microsoft-azure==4.3.0
   apache-airflow-providers-postgres==5.2.2
   apache-airflow-providers-redis==3.0.0
   apache-airflow-providers-sftp==4.1.0
   apache-airflow-providers-snowflake==3.3.0
   apache-airflow-providers-sqlite==3.2.1
   apache-airflow-providers-ssh==3.2.0
   ```
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Operating System
   
   Debian Bullseye
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   With the latest version of the common-sql provider, the `get_records` result from the hook is now an ordinary dictionary, causing this KeyError with SqlSensor:
   ```
   [2022-11-29, 00:39:18 UTC] {taskinstance.py:1851} ERROR - Task failed with 
exception
   Traceback (most recent call last):
 File "/usr/local/lib/python3.9/site-packages/airflow/sensors/base.py", 
line 189, in execute
   poke_return = self.poke(context)
 File 
"/usr/local/lib/python3.9/site-packages/airflow/providers/common/sql/sensors/sql.py",
 line 98, in poke
   first_cell = records[0][0]
   KeyError: 0
   ```
   I have only tested with Snowflake, I haven't tested it with other databases. 
Reverting back to 1.2.0 solves the issue.
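
   The shape change described above can be demonstrated with plain Python; both `records` values below are hypothetical illustrations (tuple rows versus dict rows), not values taken from the provider code:

   ```python
   # Hypothetical row shapes for illustration only.
   records_v120 = [(0,)]        # 1.2.0-style: list of tuple rows
   records_v130 = [{"0": 0}]    # 1.3.0-style: rows as dicts keyed by column name

   first_cell = records_v120[0][0]  # positional access works on tuple rows

   try:
       records_v130[0][0]           # a dict row has no integer key 0
   except KeyError as exc:
       print(f"KeyError: {exc}")    # KeyError: 0

   # One defensive way to take the first cell from either row shape:
   row = records_v130[0]
   first_cell = next(iter(row.values())) if isinstance(row, dict) else row[0]
   ```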
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   ```
   from datetime import datetime
   
   from airflow import DAG
   from airflow.operators.bash import BashOperator
   from airflow.providers.common.sql.sensors.sql import SqlSensor
   
   with DAG(
   dag_id="snowflake_deferrable",
   schedule=None,
   start_date=datetime(2022, 1, 1),
   catchup=False,
   ):
   t1 = SqlSensor(
   task_id="snowflake_test",
   conn_id="snowflake",
   sql="select 0",
   fail_on_empty=False,
   poke_interval=20,
   mode="poke",
   timeout=60 * 5,
   )
   ```
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] jedcunningham commented on a diff in pull request #27779: Chart: add airflow_local_settings to all airflow containers

2022-11-28 Thread GitBox


jedcunningham commented on code in PR #27779:
URL: https://github.com/apache/airflow/pull/27779#discussion_r1034190493


##
tests/charts/test_cleanup_pods.py:
##
@@ -239,6 +239,42 @@ def test_should_set_job_history_limits(self):
 assert 2 == jmespath.search("spec.failedJobsHistoryLimit", docs[0])
 assert 4 == jmespath.search("spec.successfulJobsHistoryLimit", docs[0])
 
+def test_no_airflow_local_settings(self):
+docs = render_chart(
+values={
+"cleanup": {
+"enabled": True,
+"failedJobsHistoryLimit": 2,
+"successfulJobsHistoryLimit": 4,
+},

Review Comment:
   ```suggestion
   "cleanup": {"enabled": True},
   ```



##
tests/charts/test_cleanup_pods.py:
##
@@ -239,6 +239,42 @@ def test_should_set_job_history_limits(self):
 assert 2 == jmespath.search("spec.failedJobsHistoryLimit", docs[0])
 assert 4 == jmespath.search("spec.successfulJobsHistoryLimit", docs[0])
 
+def test_no_airflow_local_settings(self):
+docs = render_chart(
+values={
+"cleanup": {
+"enabled": True,
+"failedJobsHistoryLimit": 2,
+"successfulJobsHistoryLimit": 4,
+},
+"airflowLocalSettings": None,
+},
+show_only=["templates/cleanup/cleanup-cronjob.yaml"],
+)
+volume_mounts = jmespath.search(
+"spec.jobTemplate.spec.template.spec.containers[0].volumeMounts", 
docs[0]
+)
+assert "airflow_local_settings.py" not in str(volume_mounts)
+
+def test_airflow_local_settings(self):
+docs = render_chart(
+values={
+"cleanup": {
+"enabled": True,
+"failedJobsHistoryLimit": 2,
+"successfulJobsHistoryLimit": 4,
+},

Review Comment:
   ```suggestion
   "cleanup": {"enabled": True},
   ```






[GitHub] [airflow] github-actions[bot] commented on pull request #27063: [WIP/Preview] Feature/26215 Proposal for AIP-50 Trigger UI based on FAB

2022-11-28 Thread GitBox


github-actions[bot] commented on PR #27063:
URL: https://github.com/apache/airflow/pull/27063#issuecomment-1329908972

   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.





[airflow] branch constraints-main updated: Updating constraints. Build id:

2022-11-28 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch constraints-main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/constraints-main by this push:
 new daeb30fc5d Updating constraints. Build id:
daeb30fc5d is described below

commit daeb30fc5d26b185c46339e7cd39c584e094147f
Author: Automated GitHub Actions commit 
AuthorDate: Tue Nov 29 00:06:07 2022 +

Updating constraints. Build id:

This update in constraints is automatically committed by the CI 
'constraints-push' step based on
HEAD of '' in ''
with commit sha .

All tests passed in this build so we determined we can push the updated 
constraints.

See 
https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for 
details.
---
 constraints-3.10.txt  | 18 +-
 constraints-3.7.txt   | 20 ++--
 constraints-3.8.txt   | 18 +-
 constraints-3.9.txt   | 18 +-
 constraints-no-providers-3.10.txt |  4 ++--
 constraints-no-providers-3.7.txt  |  4 ++--
 constraints-no-providers-3.8.txt  |  4 ++--
 constraints-no-providers-3.9.txt  |  4 ++--
 constraints-source-providers-3.10.txt | 18 +-
 constraints-source-providers-3.7.txt  | 20 ++--
 constraints-source-providers-3.8.txt  | 18 +-
 constraints-source-providers-3.9.txt  | 18 +-
 12 files changed, 82 insertions(+), 82 deletions(-)

diff --git a/constraints-3.10.txt b/constraints-3.10.txt
index ba0fe3ff1a..e582655094 100644
--- a/constraints-3.10.txt
+++ b/constraints-3.10.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-11-27T21:23:42Z
+# This constraints file was automatically generated on 2022-11-29T00:05:26Z
 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 
'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint 
generation.
@@ -172,9 +172,9 @@ billiard==3.6.4.0
 black==22.10.0
 bleach==5.0.1
 blinker==1.5
-boto3==1.26.16
+boto3==1.26.17
 boto==2.49.0
-botocore==1.29.16
+botocore==1.29.17
 bowler==0.9.0
 cachelib==0.9.0
 cachetools==4.2.2
@@ -328,7 +328,7 @@ inflection==0.5.1
 influxdb-client==1.34.0
 iniconfig==1.1.1
 ipdb==0.13.9
-ipython==8.6.0
+ipython==8.7.0
 isodate==0.6.1
 isort==5.10.1
 itsdangerous==2.1.2
@@ -347,7 +347,7 @@ jsonpointer==2.3
 jsonschema==4.17.1
 junit-xml==1.9
 jupyter-client==7.3.4
-jupyter_core==5.0.0
+jupyter_core==5.1.0
 keyring==23.11.0
 kombu==5.2.4
 krb5==0.4.1
@@ -383,7 +383,7 @@ msrestazure==0.6.4
 multi-key-dict==2.0.3
 multidict==6.0.2
 mypy-boto3-appflow==1.26.15
-mypy-boto3-rds==1.26.11.post1
+mypy-boto3-rds==1.26.17
 mypy-boto3-redshift-data==1.26.0.post1
 mypy-extensions==0.4.3
 mypy==0.971
@@ -588,11 +588,11 @@ types-cryptography==3.3.23.2
 types-docutils==0.19.1.1
 types-freezegun==1.1.10
 types-paramiko==2.12.0.1
-types-protobuf==4.21.0.0
+types-protobuf==4.21.0.1
 types-python-dateutil==2.8.19.4
 types-python-slugify==7.0.0.1
 types-pytz==2022.6.0.1
-types-redis==4.3.21.5
+types-redis==4.3.21.6
 types-requests==2.28.11.5
 types-setuptools==65.6.0.1
 types-tabulate==0.9.0.0
@@ -610,7 +610,7 @@ urllib3==1.26.13
 userpath==1.8.0
 vertica-python==1.1.1
 vine==5.0.0
-virtualenv==20.16.7
+virtualenv==20.17.0
 volatile==2.1.0
 watchtower==2.0.1
 wcwidth==0.2.5
diff --git a/constraints-3.7.txt b/constraints-3.7.txt
index 16a8a6ad29..760dce2da3 100644
--- a/constraints-3.7.txt
+++ b/constraints-3.7.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2022-11-27T21:24:03Z
+# This constraints file was automatically generated on 2022-11-29T00:06:03Z
 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 
'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint 
generation.
@@ -172,9 +172,9 @@ billiard==3.6.4.0
 black==22.10.0
 bleach==5.0.1
 blinker==1.5
-boto3==1.26.16
+boto3==1.26.17
 boto==2.49.0
-botocore==1.29.16
+botocore==1.29.17
 bowler==0.9.0
 cached-property==1.5.2
 cachelib==0.9.0
@@ -210,7 +210,7 @@ croniter==1.3.8
 cryptography==36.0.2
 curlify==2.2.1
 dask==2022.2.0
-databricks-sql-connector==2.2.0
+databricks-sql-connector==2.2.1
 datadog==0.44.0
 db-dtypes==1.0.4
 decorator==5.1.1
@@ -348,7 +348,7 @@ jsonpointer==2.3
 jsonschema==4.17.1
 junit-xml==1.9
 jupyter-client==7.3.4
-jupyter_core==4.11.2
+jupyter_core==4.12.0
 keyring==23.11.0
 kombu==5.2.4
 krb5==0.4.1
@@ -384,7 +384,7 @@ msrestazure==0.6.4
 multi-key-dict==2.0.3
 multidict==6.0.2
 mypy-boto3-appflow==1.26.15
-mypy-boto3-rds==1.26.11.post1
+mypy-boto3-rds==1.26.17
 

[GitHub] [airflow] potiuk commented on pull request #27962: Add pre-commits preventing accidental API changes in common.sql

2022-11-28 Thread GitBox


potiuk commented on PR #27962:
URL: https://github.com/apache/airflow/pull/27962#issuecomment-1329896616

   All looks good :) 





[airflow] branch main updated: Clarify docstrings for updated DbApiHook (#27966)

2022-11-28 Thread dstandish
This is an automated email from the ASF dual-hosted git repository.

dstandish pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new a158fbb6bd Clarify docstrings for updated DbApiHook (#27966)
a158fbb6bd is described below

commit a158fbb6bde07cd20003680a4cf5e7811b9eda98
Author: Daniel Standish <15932138+dstand...@users.noreply.github.com>
AuthorDate: Mon Nov 28 16:12:39 2022 -0700

Clarify docstrings for updated DbApiHook (#27966)
---
 airflow/providers/common/sql/hooks/sql.py | 57 ---
 1 file changed, 29 insertions(+), 28 deletions(-)

diff --git a/airflow/providers/common/sql/hooks/sql.py 
b/airflow/providers/common/sql/hooks/sql.py
index 06dbdb2f57..76260612a4 100644
--- a/airflow/providers/common/sql/hooks/sql.py
+++ b/airflow/providers/common/sql/hooks/sql.py
@@ -35,19 +35,20 @@ def return_single_query_results(sql: str | Iterable[str], 
return_last: bool, spl
 Determines when results of single query only should be returned.
 
 For compatibility reasons, the behaviour of the DBAPIHook is somewhat 
confusing.
-In cases, when multiple queries are run, the return values will be an 
iterable (list) of results
-- one for each query. However, in certain cases, when single query is run 
- the results will be just
-the results of that single query without wrapping the results in a list.
+In some cases, when multiple queries are run, the return value will be an 
iterable (list) of results
+-- one for each query. However, in other cases, when single query is run, 
the return value will be just
+the result of that single query without wrapping the results in a list.
 
-The cases when single query results are returned without wrapping them in 
a list are when:
+The cases when single query results are returned without wrapping them in 
a list are as follows:
 
-a) sql is string and last_statement is True (regardless what 
split_statement value is)
-b) sql is string and split_statement is False
+a) sql is string and ``return_last`` is True (regardless what 
``split_statements`` value is)
+b) sql is string and ``split_statements`` is False
 
-In all other cases, the results are wrapped in a list, even if there is 
only one statement to process:
+In all other cases, the results are wrapped in a list, even if there is 
only one statement to process.
+In particular, the return value will be a list of query results in the 
following circumstances:
 
-a) always when sql is an iterable of string statements (regardless what 
last_statement value is)
-b) when sql is string, split_statement is True and last_statement is False
+a) when ``sql`` is an iterable of string statements (regardless what 
``return_last`` value is)
+b) when ``sql`` is string, ``split_statements`` is True and 
``return_last`` is False
 
 :param sql: sql to run (either string or list of strings)
 :param return_last: whether last statement output should only be returned
@@ -272,33 +273,33 @@ class DbApiHook(BaseForDbApiHook):
 where each element in the list are results of one of the queries 
(typically list of list of rows :D)
 
 For compatibility reasons, the behaviour of the DBAPIHook is somewhat 
confusing.
-In cases, when multiple queries are run, the return values will be an 
iterable (list) of results
-- one for each query. However, in certain cases, when single query is 
run - the results will be just
-the results of that query without wrapping the results in a list.
+In some cases, when multiple queries are run, the return value will be 
an iterable (list) of results
+-- one for each query. However, in other cases, when single query is 
run, the return value will
+be the result of that single query without wrapping the results in a 
list.
 
-The cases when single query results are returned without wrapping them 
in a list are when:
+The cases when single query results are returned without wrapping them 
in a list are as follows:
 
-a) sql is string and last_statement is True (regardless what 
split_statement value is)
-b) sql is string and split_statement is False
+a) sql is string and ``return_last`` is True (regardless what 
``split_statements`` value is)
+b) sql is string and ``split_statements`` is False
 
-In all other cases, the results are wrapped in a list, even if there 
is only one statement to process:
+In all other cases, the results are wrapped in a list, even if there 
is only one statement to process.
+In particular, the return value will be a list of query results in the 
following circumstances:
 
-a) always when sql is an iterable of string statements (regardless 
what last_statement value is)
-b) when sql is string, split_statement is True 
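
The wrapping rules restated in the commit above reduce to a single boolean test over the type of `sql` and the two flags. A minimal sketch (the function name and exact signature here are illustrative, not necessarily the provider's code):

```python
from typing import Iterable, Union

def returns_single_result(sql: Union[str, Iterable[str]],
                          return_last: bool,
                          split_statements: bool) -> bool:
    """True when results of a single query are returned without list wrapping.

    Per the docstring: a string with return_last=True (regardless of
    split_statements), or a string with split_statements=False, is unwrapped;
    everything else -- including any iterable of statements -- is wrapped.
    """
    return isinstance(sql, str) and (return_last or not split_statements)

print(returns_single_result("select 1", return_last=True, split_statements=True))    # True (case a)
print(returns_single_result("select 1", return_last=False, split_statements=False))  # True (case b)
print(returns_single_result("select 1", return_last=False, split_statements=True))   # False (wrapped)
print(returns_single_result(["select 1", "select 2"], return_last=True, split_statements=True))  # False (wrapped)
```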

[GitHub] [airflow] dstandish merged pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


dstandish merged PR #27966:
URL: https://github.com/apache/airflow/pull/27966





[GitHub] [airflow] sunghospark-calm commented on issue #27561: Helm chart tries to patch immutable Job resources on helm upgrade

2022-11-28 Thread GitBox


sunghospark-calm commented on issue #27561:
URL: https://github.com/apache/airflow/issues/27561#issuecomment-1329870939

   Running into the same problem here.
   Had to disable the hooks because the migration was not running properly during upgrade; without hooks, the follow-up deployment fails because of the job spec conflict.





[GitHub] [airflow] ahipp13 commented on issue #23512: Random "duplicate key value violates unique constraint" errors when initializing the postgres database

2022-11-28 Thread GitBox


ahipp13 commented on issue #23512:
URL: https://github.com/apache/airflow/issues/23512#issuecomment-1329870020

   Issue submitted https://github.com/apache/airflow/issues/27977
   





[GitHub] [airflow] ahipp13 opened a new issue, #27977: Webserver IntegrityError When Loaded

2022-11-28 Thread GitBox


ahipp13 opened a new issue, #27977:
URL: https://github.com/apache/airflow/issues/27977

   ### Apache Airflow version
   
   2.4.3
   
   ### What happened
   
   We just upgraded from Airflow 2.2.5 to 2.4.3. When using Azure OAuth, 
whenever you load the webserver, it loads it with an error and in a weird 
looking format:
   
![image](https://user-images.githubusercontent.com/118911990/204397885-b60f951e-d57c-4daa-b8bb-b468ed183852.png)
   
   It loads in a different, broken-looking format every time you hit the refresh button, and shows the same error every time (full logs down below):
   ```
   airflow-web sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) 
duplicate key value violates unique constraint "session_session_id_key"
   airflow-web DETAIL: Key (session_id)=(7582aee0-4289-4152-99d2-918c98ff64d6) 
already exists.
   airflow-web
   airflow-web [SQL: INSERT INTO session (session_id, data, expiry) VALUES 
(%(session_id)s, %(data)s, %(expiry)s) RETURNING session.id]
   airflow-web [parameters: {'session_id': 
'7582aee0-4289-4152-99d2-918c98ff64d6', 'data': , 'expiry': datetime.datetime(2022, 11, 23, 16, 5, 59, 
467562, tzinfo=datetime.timezone.utc)}]
   airflow-web (Background on this error at: https://sqlalche.me/e/14/gkpj)
   airflow-web
   ```
   
   ### What you think should happen instead
   
   The webserver should display the normal login page without any errors. Here 
are the full logs from the kubernetes pod:
   
   webserver.logs
   airflow-web 127.0.0.1 - - [23/Nov/2022:15:35:59 +] "GET /login/ 
HTTP/1.1" 200 16330 "https://login.microsoftonline.com/; "Mozilla/5.0 (Windows 
NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 
Safari/537.36 Edg/106.0.1370.52"
   airflow-web 127.0.0.1 - - [23/Nov/2022:15:35:59 +] "GET 
/static/appbuilder/css/font-awesome.min.css HTTP/1.1" 200 0 
"https://url/login/; "Mozilla/5.0 (Windows NT 10.0; Win64; x64) 
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 Safari/537.36 
Edg/106.0.1370.52"
   airflow-web [2022-11-23 15:35:59,423] {app.py:1741} ERROR - Exception on 
/static/appbuilder/css/bootstrap.min.css [GET]
   airflow-web Traceback (most recent call last):
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1802, in _execute_context
   airflow-web self.dialect.do_execute(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py",
 line 719, in do_execute
   airflow-web cursor.execute(statement, parameters)
   airflow-web psycopg2.errors.UniqueViolation: duplicate key value violates 
unique constraint "session_session_id_key"
   airflow-web DETAIL: Key (session_id)=(7582aee0-4289-4152-99d2-918c98ff64d6) 
already exists.
   airflow-web
   airflow-web return super().save_session(*args, **kwargs)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/flask_session/sessions.py", 
line 578, in save_session
   airflow-web self.db.session.commit()
   airflow-web File "&lt;string&gt;", line 2, in commit
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 1428, in commit
   airflow-web self._transaction.commit(_to_root=self.future)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 829, in commit
   airflow-web self._prepare_impl()
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 808, in _prepare_impl
   airflow-web self.session.flush()
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 3345, in flush
   airflow-web self._flush(objects)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 3485, in flush
   airflow-web transaction.rollback(capture_exception=True)
   airflow-web File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
   airflow-web compat.raise_(
   airflow-web File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
   airflow-web raise exception
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 3445, in _flush
   airflow-web flush_context.execute()
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py",
 line 456, in execute
   airflow-web rec.execute(self)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py",
 line 630, in execute
   airflow-web util.preloaded.orm_persistence.save_obj(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 244, in save_obj
   airflow-web _emit_insert_statements(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 1221, in _emit_insert_statements
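
   The unique-violation at the core of the traceback (a second INSERT with the same `session_id` hitting the unique constraint) can be reproduced outside Airflow with any database. A minimal sqlite3 sketch; the table here is a stand-in for the flask-session table, not Airflow's actual schema:

   ```python
   import sqlite3

   conn = sqlite3.connect(":memory:")
   conn.execute(
       "CREATE TABLE session (id INTEGER PRIMARY KEY, session_id TEXT UNIQUE, data BLOB, expiry TEXT)"
   )

   sid = "7582aee0-4289-4152-99d2-918c98ff64d6"
   conn.execute("INSERT INTO session (session_id) VALUES (?)", (sid,))

   try:
       # A second save with the same session_id violates the unique constraint,
       # mirroring psycopg2.errors.UniqueViolation in the logs above.
       conn.execute("INSERT INTO session (session_id) VALUES (?)", (sid,))
   except sqlite3.IntegrityError as exc:
       print(exc)  # UNIQUE constraint failed: session.session_id
   ```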
   

[GitHub] [airflow] mag3141592 opened a new issue, #27976: `SQLColumnCheckOperator` failures after upgrading to `common-sql==1.3.0`

2022-11-28 Thread GitBox


mag3141592 opened a new issue, #27976:
URL: https://github.com/apache/airflow/issues/27976

   ### Apache Airflow Provider(s)
   
   common-sql
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-google==8.2.0
   apache-airflow-providers-http==4.0.0
   apache-airflow-providers-salesforce==5.0.0
   apache-airflow-providers-slack==5.1.0
   apache-airflow-providers-snowflake==3.2.0
   
   Issue:
   apache-airflow-providers-common-sql==1.3.0
   
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   Problem occurred when upgrading from common-sql=1.2.0 to common-sql=1.3.0
   
   
   Getting a `KEY_ERROR` when running a unique_check and null_check on a column.
   
   1.3.0 log:
   https://user-images.githubusercontent.com/15257610/204390144-97ae35b7-1a2c-4ee1-9c12-4f3940047cde.png
   
   1.2.0 log:
   https://user-images.githubusercontent.com/15257610/204389994-7e8eae17-a346-41ac-84c4-9de4be71af20.png
   
   
   ### What you think should happen instead
   
   Potential causes:
   - seems to be indexing based on the test query column `COL_NAME` instead of 
the table column `STRIPE_ID`
   - the `record` returned by the test changed type, going from a tuple to a list of dictionaries.
   - no `tolerance` is specified for these tests, so `.get('tolerance')` looks 
like it will cause an error without a default specified like `.get('tolerance', 
None)`
   
   Expected behavior:
   - these tests continue to pass with the upgrade
   - `tolerance` is not a required key.
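
   One detail from the "Potential causes" list above is easy to check with plain Python: `dict.get('tolerance')` already defaults to `None`, so `.get('tolerance', None)` is equivalent; the KeyError is more likely to come from positional indexing into dict-shaped records. A small sketch (the `record` value here is a hypothetical illustration, not taken from the operator):

   ```python
   checks = {"equal_to": 0}  # a check spec with no tolerance key, as in the reproduction DAG

   # .get with no second argument already falls back to None:
   print(checks.get("tolerance"))        # None
   print(checks.get("tolerance", None))  # None (identical behaviour)

   # Positional access on a dict-shaped row, by contrast, raises KeyError:
   record = {"COL_NAME": "FIELD", "CHECK_TYPE": "unique_check", "CHECK_RESULT": 0}
   try:
       record[0]
   except KeyError as exc:
       print(f"KeyError: {exc}")  # KeyError: 0
   ```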
   
   ### How to reproduce
   
   ```
   from datetime import datetime
   from airflow import DAG
   
   from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
   from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator
   
   my_conn_id = "snowflake_default"
   
   default_args={"conn_id": my_conn_id}
   
   with DAG(
   dag_id="airflow_providers_example",
   schedule=None,
   start_date=datetime(2022, 11, 27),
   default_args=default_args,
   ) as dag:
   
   create_table = SnowflakeOperator(
   task_id="create_table",
   sql=""" CREATE OR REPLACE TABLE testing AS (
   SELECT
   1 AS row_num,
   NULL AS field
   
   UNION ALL
   
   SELECT
   2 AS row_num,
   'test' AS field
   
   UNION ALL
   
   SELECT
   3 AS row_num,
   'test' AS field
   )""",
   )
   
   column_checks = SQLColumnCheckOperator(
   task_id="column_checks",
   table="testing",
   column_mapping={
   "field": {"unique_check": {"equal_to": 0}, "null_check": 
{"equal_to": 0}}
   },
   )
   
    create_table >> column_checks
    ```
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] boring-cyborg[bot] commented on issue #27976: `SQLColumnCheckOperator` failures after upgrading to `common-sql==1.3.0`

2022-11-28 Thread GitBox


boring-cyborg[bot] commented on issue #27976:
URL: https://github.com/apache/airflow/issues/27976#issuecomment-1329865991

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   





[GitHub] [airflow] skyboi1233 commented on issue #27975: expand or expand_kwargs()

2022-11-28 Thread GitBox


skyboi1233 commented on issue #27975:
URL: https://github.com/apache/airflow/issues/27975#issuecomment-1329864233

   On Mon, Nov 28, 2022 at 4:59 PM tkansara ***@***.***> wrote:
   
    > Closed #27975 as completed.
    >
    > Reply to this email directly, view it on GitHub, or unsubscribe.
    > You are receiving this because you are subscribed to this thread.
   -- 
   Skyler Williams
   





[GitHub] [airflow] tkansara closed issue #27975: expand or expand_kwargs()

2022-11-28 Thread GitBox


tkansara closed issue #27975: expand or expand_kwargs()
URL: https://github.com/apache/airflow/issues/27975





[GitHub] [airflow] tkansara commented on issue #27975: expand or expand_kwargs()

2022-11-28 Thread GitBox


tkansara commented on issue #27975:
URL: https://github.com/apache/airflow/issues/27975#issuecomment-1329862566

    I believe the documentation is correct, and I am missing some link here. 





[GitHub] [airflow] tkansara commented on issue #27975: expand or expand_kwargs()

2022-11-28 Thread GitBox


tkansara commented on issue #27975:
URL: https://github.com/apache/airflow/issues/27975#issuecomment-1329860852

    Or is this correct and I am missing something... Ty for checking. 





[GitHub] [airflow] pierrejeambrun commented on pull request #27974: [AIP-51] Add helper to import default executor class

2022-11-28 Thread GitBox


pierrejeambrun commented on PR #27974:
URL: https://github.com/apache/airflow/pull/27974#issuecomment-1329857439

   Really useful indeed, thanks. I'll update that when this one is merged :)





[GitHub] [airflow] tkansara opened a new issue, #27975: expand or expand_kwargs()

2022-11-28 Thread GitBox


tkansara opened a new issue, #27975:
URL: https://github.com/apache/airflow/issues/27975

   ### What do you see as an issue?
   
   In the section: 
   
https://airflow.apache.org/docs/apache-airflow/stable/concepts/dynamic-task-mapping.html#assigning-multiple-parameters-to-a-non-taskflow-operator
   
   there is piece of code:
    ```
    copy_filenames = S3CopyObjectOperator.partial(
        task_id="copy_files", source_bucket_name=list_filenames.bucket
    ).expand_kwargs(copy_kwargs)
    ```
   
    Is the `expand_kwargs()` correct, or should it be `expand()`?
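For context, the two mapping styles iterate over different shapes of input. A plain-Python sketch of the documented semantics (hypothetical helper functions, not Airflow's implementation): `expand` maps over the cross product of per-keyword lists, while `expand_kwargs` takes one complete kwargs dict per mapped task.

```python
from itertools import product

def expand(func, **mapped):
    # .expand(x=[...], y=[...]) style: one call per cross-product combination
    keys = list(mapped)
    return [func(**dict(zip(keys, combo))) for combo in product(*mapped.values())]

def expand_kwargs(func, kwarg_sets):
    # .expand_kwargs([{...}, {...}]) style: one call per complete kwargs dict
    return [func(**kw) for kw in kwarg_sets]

def copy_file(source, dest):
    return (source, dest)
```

Since `copy_kwargs` in the documented example is a list of dicts (each holding all keyword arguments for one copy), `expand_kwargs` is the matching call.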
   
   ### Solving the problem
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] potiuk commented on issue #23512: Random "duplicate key value violates unique constraint" errors when initializing the postgres database

2022-11-28 Thread GitBox


potiuk commented on issue #23512:
URL: https://github.com/apache/airflow/issues/23512#issuecomment-1329853497

   Not sure why - but yes. I ping the wrong person :)





[GitHub] [airflow] zachliu commented on issue #23512: Random "duplicate key value violates unique constraint" errors when initializing the postgres database

2022-11-28 Thread GitBox


zachliu commented on issue #23512:
URL: https://github.com/apache/airflow/issues/23512#issuecomment-1329847946

   @potiuk you probably `@` the wrong guy :grin: 
   
    @ahipp13 Can you please open a new issue for 2.4.3 with more details? I also 
have a feeling this is a different issue. This issue is centered around the 
`ab_permission_view_role_permission_view_id_role_id_key` constraint, which is 
role management & permission related; yours is the `session_session_id_key`, so 
my guess is you somehow have a duplicate session.





[GitHub] [airflow] o-nikolas commented on pull request #27974: [AIP-51] Add helper to import default executor class

2022-11-28 Thread GitBox


o-nikolas commented on PR #27974:
URL: https://github.com/apache/airflow/pull/27974#issuecomment-1329843566

   @pierrejeambrun 
   This would be nice to use for #27941 and future PRs!





[GitHub] [airflow] o-nikolas opened a new pull request, #27974: [AIP-51] Add helper to import default executor class

2022-11-28 Thread GitBox


o-nikolas opened a new pull request, #27974:
URL: https://github.com/apache/airflow/pull/27974

    This will be useful at the several points during AIP-51 where the default 
executor class needs to be imported to replace specific points of coupling.
    Also adds missing tests for the executor loader import mechanism.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





[GitHub] [airflow] pierrejeambrun commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


pierrejeambrun commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329842548

    Haha, I am familiar with this as well.
    
    What I do now is configure mypy and flake8 as linters in my IDE. This way it 
is really easy to see/resolve and iterate on typing errors. Then the pre-commit 
hooks are just an extra check :)





[GitHub] [airflow] ahipp13 opened a new issue, #27973: Azure OAuth CSRF State Not Equal Error

2022-11-28 Thread GitBox


ahipp13 opened a new issue, #27973:
URL: https://github.com/apache/airflow/issues/27973

   ### Apache Airflow version
   
   2.4.3
   
   ### What happened
   
   We have enabled Microsoft Azure OAuth for our Airflow implementation. When 
we try to log in, we get a CSRF error: 
   
    [2022-11-28 22:04:58,744] {views.py:659} ERROR - Error authorizing OAuth 
access token: mismatching_state: CSRF Warning! State not equal in request and 
response.
   
   We have taken a look at both the sending and receiving URLs in the browser 
and the state is the exact same. Down below are pictures of the state for the 
request and response:
   
![2022-11-28_16-22-46](https://user-images.githubusercontent.com/118911990/204394405-2e7a0029-9b18-4090-89e6-52a5a41dc25d.png)
   
![2022-11-28_16-26-24](https://user-images.githubusercontent.com/118911990/204394418-d2603bbf-6668-4fda-ad75-58fce46c3d44.png)
   
   
   ### What you think should happen instead
   
   We should be able to log into our Airflow application. We had the exact same 
setup using Airflow 2.2.5 and everything worked just fine. 
   
   ### How to reproduce
   
   Down below is a copy of our webserver_config.py. We are currently running 
Airflow 2.4.3 on Kubernetes with the Airflow Community helm chart version 8.6.1 
(located here: https://github.com/airflow-helm/charts). We are also using a 
postgres external database as our metadata db. 
   
   
   ```
   from flask_appbuilder.security.manager import AUTH_OAUTH
   from airflow.www.security import AirflowSecurityManager
   import logging
   from typing import Dict, Any, List, Union
   import os
   import sys
   
   #Add this as a module to pythons path
   sys.path.append('/opt/airflow')
   
   log = logging.getLogger(__name__)
   log.setLevel(os.getenv("AIRFLOW__LOGGING__FAB_LOGGING_LEVEL", "DEBUG"))
   
   class AzureCustomSecurity(AirflowSecurityManager):
   # In this example, the oauth provider == 'azure'.
   # If you ever want to support other providers, see how it is done here:
   # 
https://github.com/dpgaspar/Flask-AppBuilder/blob/master/flask_appbuilder/security/manager.py#L550
   def get_oauth_user_info(self, provider, resp):
   # Creates the user info payload from Azure.
   # The user previously allowed your app to act on their behalf,
   #   so now we can query the user and teams endpoints for their data.
   # Username and team membership are added to the payload and returned 
to FAB.
   if provider == "azure":
   log.debug("Azure response received : {0}".format(resp))
   id_token = resp["id_token"]
   log.debug(str(id_token))
   me = self._azure_jwt_token_parse(id_token)
   log.debug("Parse JWT token : {0}".format(me))
   return {
   "name": me.get("name", ""),
   "email": me["upn"],
   "first_name": me.get("given_name", ""),
   "last_name": me.get("family_name", ""),
   "id": me["oid"],
   "username": me["oid"],
   "role_keys": me.get("roles", []),
   }
   
   # Adding this in because if not the redirect url will start with http and we 
want https
   os.environ["AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX"] = "True"
   WTF_CSRF_ENABLED = False
   CSRF_ENABLED = False
   AUTH_TYPE = AUTH_OAUTH
   AUTH_ROLES_SYNC_AT_LOGIN = True  # Checks roles on every login
   # Make sure to replace this with the path to your security manager class
   FAB_SECURITY_MANAGER_CLASS = "webserver_config.AzureCustomSecurity"
   # a mapping from the values of `userinfo["role_keys"]` to a list of FAB roles
   AUTH_ROLES_MAPPING = {
   "airflow_dev_admin": ["Admin"],
   "airflow_dev_op": ["Op"],
   "airflow_dev_user": ["User"],
   "airflow_dev_viewer": ["Viewer"]
   }
   # force users to re-auth after 30min of inactivity (to keep roles in sync)
   PERMANENT_SESSION_LIFETIME = 1800
   # If you wish, you can add multiple OAuth providers.
   OAUTH_PROVIDERS = [
   {
   "name": "azure",
   "icon": "fa-windows",
   "token_key": "access_token",
   "remote_app": {
   "client_id": "CLIENT_ID",
   "client_secret": 'AZURE_DEV_CLIENT_SECRET',
    "api_base_url": "https://login.microsoftonline.com/TENANT_ID",
   "request_token_url": None,
   'request_token_params': {
   'scope': 'openid email profile'
   },
    "access_token_url": 
"https://login.microsoftonline.com/TENANT_ID/oauth2/v2.0/token",
   "access_token_params": {
   'scope': 'openid email profile'
   },

[GitHub] [airflow] boring-cyborg[bot] commented on issue #27973: Azure OAuth CSRF State Not Equal Error

2022-11-28 Thread GitBox


boring-cyborg[bot] commented on issue #27973:
URL: https://github.com/apache/airflow/issues/27973#issuecomment-1329841579

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   





[GitHub] [airflow] dstandish commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


dstandish commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329838369

   > BTW. `SKIP=run-mypy,run-flake8` as global env might help even if there are 
some slow-downs
   
   yes this is how i have it configured





[GitHub] [airflow] potiuk commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


potiuk commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329838043

   BTW. `SKIP=run-mypy,run-flake8` as global env might help even if there are 
some slow-downs





[GitHub] [airflow] dstandish commented on a diff in pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish commented on code in PR #27971:
URL: https://github.com/apache/airflow/pull/27971#discussion_r1034114562


##
.github/workflows/ci.yml:
##
@@ -778,8 +780,8 @@ jobs:
 needs: [build-info, wait-for-ci-images]
 strategy:
   matrix:
-python-version: 
"${{fromJson(needs.build-info.outputs.python-versions)}}"
-postgres-version: 
"${{fromJson(needs.build-info.outputs.postgres-versions)}}"
+python-version: ["3.7"]

Review Comment:
   yeah both this and putting `test-postgres` everywhere was *just* for testing






[GitHub] [airflow] potiuk commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


potiuk commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329836965

   > > Installed pre-commits really save on iterating with CI @dstandish 
   > 
   > yeah, sigh... i have them enabled but i have flake8 and mypy disabled 
because
   > 
   > iteration time 
   > 
   > just too slow on mac... although it was a while ago i last checked
   
   Let's speed'em up. On my M1 I have completely no problems with slow 
iteration time on commits :)





[GitHub] [airflow] dstandish commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


dstandish commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329836048

   > Installed pre-commits really save on iterating with CI @dstandish 
   
   yeah, sigh... i have them enabled but i have flake8 and mypy disabled 
because
   
   iteration time  
   
   just too slow on mac... although it was a while ago i last checked





[GitHub] [airflow] dstandish commented on a diff in pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


dstandish commented on code in PR #27966:
URL: https://github.com/apache/airflow/pull/27966#discussion_r1034112520


##
airflow/providers/common/sql/hooks/sql.py:
##
@@ -272,33 +273,33 @@ def run(
 where each element in the list are results of one of the queries 
(typically list of list of rows :D)
 
 For compatibility reasons, the behaviour of the DBAPIHook is somewhat 
confusing.
-In cases, when multiple queries are run, the return values will be an 
iterable (list) of results
-- one for each query. However, in certain cases, when single query is 
run - the results will be just
-the results of that query without wrapping the results in a list.
+In some cases, when multiple queries are run, the return value will be 
an iterable (list) of results
+-- one for each query. However, in other cases, when single query is 
run, the return value will be just
+the result of that single query without wrapping the results in a list.

Review Comment:
   ```suggestion
   In some cases, when multiple queries are run, the return value will 
be an iterable (list) of results
   -- one for each query. However, in other cases, when single query is 
run, the return value will
   be the result of that single query without wrapping the results in a 
list.
   ```
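The return-value convention being documented can be sketched in plain Python. This is an illustrative stand-in under stated assumptions (a hypothetical `execute` helper that echoes its statement), not the provider's actual implementation:

```python
def run(sql):
    """Sketch of the described convention: list in -> list of results out,
    single string in -> bare result out (not a one-element list)."""
    def execute(statement):
        # Stand-in for cursor.execute + result handling; just echoes the statement.
        return f"result of {statement}"

    statements = sql if isinstance(sql, list) else [sql]
    results = [execute(s) for s in statements]
    # For compatibility, a single query returns its bare result rather than [result].
    return results if isinstance(sql, list) else results[0]
```

The asymmetry is exactly what the docstring calls confusing: callers must know whether they passed a list to know the shape of the return value.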






[GitHub] [airflow] potiuk commented on pull request #27264: Attempt to add Python 3.11 support

2022-11-28 Thread GitBox


potiuk commented on PR #27264:
URL: https://github.com/apache/airflow/pull/27264#issuecomment-1329829469

   FYI. Databricks-connector-sql has just been released with Python 3.11 
compatibility. One step less to go.





[GitHub] [airflow] dstandish commented on pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish commented on PR #27971:
URL: https://github.com/apache/airflow/pull/27971#issuecomment-1329825428

   it would help if there were too many PRs competing for too few runners -- 
then we would quickly get a read on the tests most likely to fail.  but that 
doesn't really seem to be the case here.





[GitHub] [airflow] potiuk commented on issue #23512: Random "duplicate key value violates unique constraint" errors when initializing the postgres database

2022-11-28 Thread GitBox


potiuk commented on issue #23512:
URL: https://github.com/apache/airflow/issues/23512#issuecomment-1329825007

   Can you please open a new issue for 2.4.3 @zachliu with more details? I have 
a feeling this is somehow different issue.





[GitHub] [airflow] dstandish commented on pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish commented on PR #27971:
URL: https://github.com/apache/airflow/pull/27971#issuecomment-1329823364

   yeah i agree.  i was actually just about to close the PR.





[GitHub] [airflow] dstandish closed pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish closed pull request #27971: Experiment: Run static checks and 
postgres tests before all else
URL: https://github.com/apache/airflow/pull/27971





[GitHub] [airflow] dstandish commented on a diff in pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish commented on code in PR #27971:
URL: https://github.com/apache/airflow/pull/27971#discussion_r1034103825


##
.github/workflows/ci.yml:
##
@@ -366,7 +368,7 @@ jobs:
   ${{needs.build-info.outputs.build-job-description}} PROD images
   ${{needs.build-info.outputs.all-python-versions-list-as-string}}
 runs-on: "${{needs.build-info.outputs.runs-on}}"
-needs: [build-info, build-ci-images]
+needs: [build-info, build-ci-images, tests-postgres]

Review Comment:
   i agree, if we want to do something like that... this would be the way to go.
   






[GitHub] [airflow] potiuk commented on pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


potiuk commented on PR #27971:
URL: https://github.com/apache/airflow/pull/27971#issuecomment-1329820819

   I am not sure if this is helpful,
   
    For non-hosted runners there should be no difference. We already have 900 
runners by the ASF and we should have no more queueing here (at least we 
stopped seeing those) - so this is pretty ok to run all test types in parallel. 
And it will introduce significant "sequential" delays in case of non-postgres 
failures I am afraid. Rather than running tests in parallel, it will run them 
sequentially, so "feedback time" will be bigger.
   
   When optimising CI builds there are two aspects:
   
   * feedback time
   * cost (build hours usage)
   
   Feedback time is very important - the  time between push and "result" should 
be as short as possible. Putting sequential dependency between tests increases 
feedback time, but saves cost (assuming job queue is infinite - and with 900 
jobs in the ASF as of 3 weeks, for now - until ASF usage grows significantly it 
almost is - we used to have 300). Cost does not matter - it is essentially 
free.
   





[GitHub] [airflow] potiuk commented on a diff in pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


potiuk commented on code in PR #27971:
URL: https://github.com/apache/airflow/pull/27971#discussion_r1034097967


##
.github/workflows/ci.yml:
##
@@ -366,7 +368,7 @@ jobs:
   ${{needs.build-info.outputs.build-job-description}} PROD images
   ${{needs.build-info.outputs.all-python-versions-list-as-string}}
 runs-on: "${{needs.build-info.outputs.runs-on}}"
-needs: [build-info, build-ci-images]
+needs: [build-info, build-ci-images, tests-postgres]

Review Comment:
    I suggest to introduce an empty "smoke-tests" job that will be dependent on 
tests-postgres and use a dependency like:
    
    ```
    [build-info, smoke-tests]
    ```
    
    This will be much more "generic". Unlike `build-info` (which is used in 
`needs.build-info`), we do not need `build-ci-images` if "smoke-tests" (or 
"tests-postgres") already depends on it. Unfortunately, GitHub Actions 
dependencies are not transitive, so if we need `build-info` outputs, we still 
need to add it to the `needs` clause here.
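One possible shape for such an empty gating job, as a sketch only (hypothetical job and step names; standard GitHub Actions syntax):

```yaml
smoke-tests:
  name: "Smoke tests gate"
  runs-on: ubuntu-22.04
  needs: [tests-postgres]
  steps:
    - run: echo "postgres smoke tests passed"
```

Downstream jobs would then list `[build-info, smoke-tests]` in their `needs` clause instead of depending on `tests-postgres` directly.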
   
   
   








[GitHub] [airflow] ahipp13 commented on issue #23512: Random "duplicate key value violates unique constraint" errors when initializing the postgres database

2022-11-28 Thread GitBox


ahipp13 commented on issue #23512:
URL: https://github.com/apache/airflow/issues/23512#issuecomment-1329812257

   We are still having this issue in 2.4.3. We set [webserver].workers to 1 for 
a workaround. Here were the logs we were seeing:
   
   airflow-web 127.0.0.1 - - [23/Nov/2022:15:35:59 +] "GET /login/ 
HTTP/1.1" 200 16330 "https://login.microsoftonline.com/" "Mozilla/5.0 (Windows 
NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 
Safari/537.36 Edg/106.0.1370.52"
   airflow-web 127.0.0.1 - - [23/Nov/2022:15:35:59 +] "GET 
/static/appbuilder/css/font-awesome.min.css HTTP/1.1" 200 0 
"https://dev-edw-airflow.hcck8s-ctc-np1.optum.com/login/" "Mozilla/5.0 (Windows 
NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 
Safari/537.36 Edg/106.0.1370.52"
   airflow-web [2022-11-23 15:35:59,423] {app.py:1741} ERROR - Exception on 
/static/appbuilder/css/bootstrap.min.css [GET]
   airflow-web Traceback (most recent call last):
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1802, in _execute_context
   airflow-web self.dialect.do_execute(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/default.py",
 line 719, in do_execute
   airflow-web cursor.execute(statement, parameters)
   airflow-web psycopg2.errors.UniqueViolation: duplicate key value violates 
unique constraint "session_session_id_key"
   airflow-web DETAIL: Key (session_id)=(7582aee0-4289-4152-99d2-918c98ff64d6) 
already exists.
   airflow-web
   airflow-web return super().save_session(*args, **kwargs)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/flask_session/sessions.py", 
line 578, in save_session
   airflow-web self.db.session.commit()
   airflow-web File "", line 2, in commit
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 1428, in commit
   airflow-web self._transaction.commit(_to_root=self.future)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 829, in commit
   airflow-web self._prepare_impl()
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 808, in _prepare_impl
   airflow-web self.session.flush()
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 3345, in flush
   airflow-web self._flush(objects)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 3485, in flush
   airflow-web transaction.rollback(capture_exception=True)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py",
 line 70, in __exit__
   airflow-web compat.raise_(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", 
line 207, in raise_
   airflow-web raise exception
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/session.py", 
line 3445, in _flush
   airflow-web flush_context.execute()
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py",
 line 456, in execute
   airflow-web rec.execute(self)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/unitofwork.py",
 line 630, in execute
   airflow-web util.preloaded.orm_persistence.save_obj(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 244, in save_obj
   airflow-web _emit_insert_statements(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/persistence.py",
 line 1221, in _emit_insert_statements
   airflow-web result = connection._execute_20(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1614, in _execute_20
   airflow-web return meth(self, args_10style, kwargs_10style, 
execution_options)
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", 
line 325, in _execute_on_connection
   airflow-web return connection._execute_clauseelement(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1481, in _execute_clauseelement
   airflow-web ret = self._execute_context(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 1845, in _execute_context
   airflow-web self._handle_dbapi_exception(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/engine/base.py", 
line 2026, in _handle_dbapi_exception
   airflow-web util.raise_(
   airflow-web File 
"/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/util/compat.py", 
line 207, in raise_
   airflow-web raise exception
   airflow-web File 
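   As an aside, the `[webserver] workers` workaround mentioned at the top of this message corresponds to the following `airflow.cfg` fragment (or, assuming the standard Airflow config-to-environment-variable mapping, `AIRFLOW__WEBSERVER__WORKERS=1`):

   ```ini
   [webserver]
   # Run a single gunicorn webserver worker as a workaround, so that
   # concurrent workers cannot race on inserting the same session row.
   workers = 1
   ```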

[GitHub] [airflow] potiuk commented on a diff in pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


potiuk commented on code in PR #27971:
URL: https://github.com/apache/airflow/pull/27971#discussion_r1034095249


##
.github/workflows/ci.yml:
##
@@ -778,8 +780,8 @@ jobs:
 needs: [build-info, wait-for-ci-images]
 strategy:
   matrix:
-python-version: 
"${{fromJson(needs.build-info.outputs.python-versions)}}"
-postgres-version: 
"${{fromJson(needs.build-info.outputs.postgres-versions)}}"
+python-version: ["3.7"]

Review Comment:
   This is a bit dangerous and not needed. For most cases (as of last week) the
exclude matrices already select only "representative" combos for the tests - so
I think you do not save anything by hard-coding those.
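   For context, the line under discussion selects the matrix dimensions from the `build-info` outputs rather than hard-coding them; a rough sketch of that pattern (names assumed from the quoted workflow, not verified against the full file):

   ```yaml
   tests-postgres:
     needs: [build-info]
     strategy:
       matrix:
         # build-info already emits only the "representative" combos,
         # so hard-coding e.g. ["3.7"] here would not save CI time.
         python-version: ${{ fromJson(needs.build-info.outputs.python-versions) }}
         postgres-version: ${{ fromJson(needs.build-info.outputs.postgres-versions) }}
   ```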






[airflow] branch main updated (b39457f0e3 -> 4a391150aa)

2022-11-28 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from b39457f0e3 Fix deprecation warnings from 
`tests/api_connexion/endpoints/test_extra_link_endpoint.py` (#27956)
 add 4a391150aa Documentation for the LocalTaskJob return code counter 
(#27972)

No new revisions were added by this update.

Summary of changes:
 docs/apache-airflow/logging-monitoring/metrics.rst | 88 +++---
 1 file changed, 45 insertions(+), 43 deletions(-)



[GitHub] [airflow] potiuk merged pull request #27972: Documentation for the LocalTaskJob return code counter

2022-11-28 Thread GitBox


potiuk merged PR #27972:
URL: https://github.com/apache/airflow/pull/27972





[GitHub] [airflow] potiuk commented on pull request #27969: Completed D400 for multiple folders

2022-11-28 Thread GitBox


potiuk commented on PR #27969:
URL: https://github.com/apache/airflow/pull/27969#issuecomment-1329806323

   Checks :( 





[GitHub] [airflow] potiuk commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


potiuk commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329805947

   Installed pre-commits really save on iterating with CI @dstandish  





[GitHub] [airflow] potiuk commented on pull request #27966: Clarify docstrings for updated DbApiHook

2022-11-28 Thread GitBox


potiuk commented on PR #27966:
URL: https://github.com/apache/airflow/pull/27966#issuecomment-1329805138

   Line too long :( 





[GitHub] [airflow] JCoder01 commented on issue #27523: Jumping tasks in grid

2022-11-28 Thread GitBox


JCoder01 commented on issue #27523:
URL: https://github.com/apache/airflow/issues/27523#issuecomment-1329804692

   Taking a longer look at mine, I'm seeing the same thing. This occurs on a
DAG with a task group: when the task group is expanded, the task group and
downstream tasks move around. It alternates between being correctly ordered and
having the expanded task group and the downstream tasks appear before the
completed upstream tasks.





[GitHub] [airflow] howardyoo commented on pull request #27787: POC / WIP testing out using logfmt for more structured logging

2022-11-28 Thread GitBox


howardyoo commented on PR #27787:
URL: https://github.com/apache/airflow/pull/27787#issuecomment-1329803476

   Right, the logging is currently experimental. That's why it was excluded
   for now.
   
   On Mon, Nov 28, 2022 at 3:32 PM Jarek Potiuk ***@***.***>
   wrote:
   
   > I looked at the AIP, but it looks like logging was deliberately excluded
   > from it:
   >
   > Rather deferred until it's ready. The AIP was approved few months ago and
   > it will take likely few months before someone starts working on it. In the
   > meantime OpenTelemetry's logging has evolved. I guess you should see what's
   > the status of logging there. The direction is clear.
   >
   > —
   > Reply to this email directly, view it on GitHub
   > ,
   > or unsubscribe
   > 

   > .
   > You are receiving this because you were mentioned.Message ID:
   > ***@***.***>
   >
   





[GitHub] [airflow] potiuk commented on a diff in pull request #27962: Add pre-commits preventing accidental API changes in common.sql

2022-11-28 Thread GitBox


potiuk commented on code in PR #27962:
URL: https://github.com/apache/airflow/pull/27962#discussion_r1034088447


##
airflow/providers/common/sql/README_API.md:
##
@@ -0,0 +1,103 @@
+
+
+# Keeping the API of common.sql provider backwards compatible
+
+The API of the `common.sql` provider should be kept backwards compatible with previously released versions.
+The reason is that there are already released providers that use the `common.sql` provider and rely on its
+API and behaviour, and any update to either might potentially break those released providers.
+
+Therefore, we should take extra care when changing the APIs.
+
+The approach we take is similar to the one the Android OS team has applied to keep their API in check,
+and it is based on storing the current version of the API and flagging changes that are potentially breaking.
+This is done by comparing the previous API (stored in stub files) with the upcoming API from the PR.
+The upcoming API is automatically extracted from the `common.sql` Python files by the `update-common-sql-api-stubs`
+pre-commit hook using mypy's `stubgen` and stored as `.pyi` files in the `airflow.providers.common.sql` package.
+We also post-process the `.pyi` files to add some historically exposed methods that should also be
+considered part of the public API.
+
+If the comparison determines that the change is potentially breaking, the 
contributor is asked
+to review the changes and manually regenerate the stub files.
+
+The details of the workflow are as follows:
+
+1) The previous API is stored in the (committed to the repository) stub files.
+2) Every time the `common.sql` Python files are modified, the `update-common-sql-api-stubs` pre-commit
+   hook regenerates the stubs (including post-processing them) and looks for potentially breaking changes
+   (removals or updates of existing classes/methods).
+3) If the check reveals there are no changes to the API, nothing happens and the pre-commit succeeds.
+4) If there are only additions, the pre-commit automatically updates the stub files,
+   asks the contributor to commit the resulting updates, and fails the pre-commit. This is very similar to
+   other static checks that automatically modify/fix source code.
+5) If the pre-commit detects potentially breaking changes, the process is a bit more involved for the
+   contributor. The pre-commit flags such changes to the contributor by failing the pre-commit and
+   asks the contributor to review the change, looking specifically for anything that breaks
+   compatibility with previously released providers (and to fix any backwards-compatibility issues).
+   Once this is completed, the contributor is asked to manually and explicitly regenerate and commit
+   the new version of the stubs by running the pre-commit with a manually added environment variable:
+
+```shell
+UPDATE_COMMON_SQL_API=1 pre-commit run update-common-sql-api-stubs
+```
+
+# Verifying that other providers use only the public API of the `common.sql` provider
+
+MyPy automatically checks that only the methods available in the stubs are used. This gives enough protection
for

Review Comment:
   Right 🤦 .






[GitHub] [airflow] potiuk commented on issue #27593: Object of type V1Pod is not JSON serializable after detecting zombie jobs cause Scheduler CrashLoopBack

2022-11-28 Thread GitBox


potiuk commented on issue #27593:
URL: https://github.com/apache/airflow/issues/27593#issuecomment-1329796733

   Yes. Please provide the details. Asking to "reopen" without providing any
more details is not really helpful in diagnosing and fixing an issue.
   
   An already closed issue has no chance of being solved if there are not enough
details about the new appearance (likely with a different root cause) to
diagnose and reproduce it - so just asking anyone to reopen an issue without
details that would help to diagnose it is counterproductive and makes no
difference to whether it will be solved or not.
   
   On the other hand, providing useful information, stack traces, circumstances
and the results of your investigation enormously helps any attempt to
investigate and fix it.





[GitHub] [airflow] potiuk commented on issue #27963: DatabricksSQLOperator ValueError “too many values to unpack”

2022-11-28 Thread GitBox


potiuk commented on issue #27963:
URL: https://github.com/apache/airflow/issues/27963#issuecomment-1329791335

   Closing as fixed - provisionally 





[GitHub] [airflow] potiuk closed issue #27963: DatabricksSQLOperator ValueError “too many values to unpack”

2022-11-28 Thread GitBox


potiuk closed issue #27963:  DatabricksSQLOperator ValueError “too many values 
to unpack”
URL: https://github.com/apache/airflow/issues/27963





[GitHub] [airflow] potiuk commented on issue #27963: DatabricksSQLOperator ValueError “too many values to unpack”

2022-11-28 Thread GitBox


potiuk commented on issue #27963:
URL: https://github.com/apache/airflow/issues/27963#issuecomment-1329788860

   > Interesting note: the 3.3.0 DatabricksSQLOperator had a parameter for 
control writing to xcom. It appears to be removed in 4.0.0. This is causing the 
dump of a lot of PII into XCOM for my use case.
   
   Nope. You can still use `do_xcom_push=False`, which comes from
`BaseOperator`. Did you try it?





[GitHub] [airflow] potiuk commented on pull request #27787: POC / WIP testing out using logfmt for more structured logging

2022-11-28 Thread GitBox


potiuk commented on PR #27787:
URL: https://github.com/apache/airflow/pull/27787#issuecomment-1329787455

   > I looked at the AIP, but it looks like logging was deliberately excluded 
from it:
   
   Rather, deferred until it's ready. The AIP was approved a few months ago and
it will likely take a few months before someone starts working on it. In the
meantime, OpenTelemetry's logging has evolved. I guess you should check what the
status of logging there is. The direction is clear.





[GitHub] [airflow] potiuk commented on issue #26492: Cannot fetch log from Celery worker

2022-11-28 Thread GitBox


potiuk commented on issue #26492:
URL: https://github.com/apache/airflow/issues/26492#issuecomment-1329780031

   > Hi @potiuk , I ran into this error on 2.4.3. The env I'm using is directly 
from the official docker compose file from 
[here](https://airflow.apache.org/docs/apache-airflow/2.4.3/docker-compose.yaml).
   The way to reproduce it is to create and run a simple dag and check the log 
from UI.
   Thanks for looking into it.
   
   Just to clarify it - the docker compose is NOT the official compose. It's a
quick-start docker compose that can get you up and running, but if you have any
problems with it (in your environment) you should apply your docker compose
expertise to be able to analyse it, I am afraid (and this is very clearly
written
[here](https://airflow.apache.org/docs/apache-airflow/2.4.3/howto/docker-compose/index.html)
 as a "Caution"):
   
   > This procedure can be useful for learning and exploration. However, 
adapting it for use in real-world situations can be complicated. Making changes 
to this procedure will require specialized expertise in Docker & Docker 
Compose, and the Airflow community may not be able to help you.
   
   > For that reason, we recommend using Kubernetes with the [Official Airflow 
Community Helm 
Chart](https://airflow.apache.org/docs/helm-chart/stable/index.html) when you 
are ready to run Airflow in production.
   
   I just followed (again, as a million times before) all the steps one-by-one
from our quick-start and unfortunately I cannot reproduce your problem. This is
what I get:
   
   https://user-images.githubusercontent.com/595491/204383801-b210587c-a073-4a99-9032-7be2c8354f5d.png
   
   So NO. It's not easily reproducible. You have to do better.
   
   Docker compose functionality depends on many local factors. This is one of
the reasons why we do not recommend that people who are not already experts in
docker compose use it - there is a learning curve involved and unfortunately,
when someone expects that things will "just work", they are gravely mistaken.
You need to dive deep and understand a lot of what happens under the hood in
order to actually run anything (not only Airflow) in docker compose. And we
warned you that you need to do a lot of investigation on your own - and provide
useful information if you want help. Writing "it does not work for me" is not
useful information that will help those who want to help in their free time
(because this is what is happening here - I am responding to you in my free
time). But you give me little chance to help you without spending your own time
first.

   If you want any help for your case - where you likely have some
misconfiguration of some sort - we are unfortunately not able to magically
guess your docker / docker-compose configuration, and if you really need some
help with solving your problem
   





[GitHub] [airflow] bt- commented on issue #27523: Jumping tasks in grid

2022-11-28 Thread GitBox


bt- commented on issue #27523:
URL: https://github.com/apache/airflow/issues/27523#issuecomment-1329752163

   I am running into the same thing with my DAG. I have the tasks within my 
TaskGroups dynamically generated based on a list parsed from a yaml file. After 
watching more closely, it is my TaskGroups that move around in the grid view. 
The tasks within the TaskGroup aren't moving. 





[GitHub] [airflow] IKholopov opened a new pull request, #27972: Documentation for the LocalTaskJob return code counter

2022-11-28 Thread GitBox


IKholopov opened a new pull request, #27972:
URL: https://github.com/apache/airflow/pull/27972

   Adding missing documentation for the recently added metric.
   





[GitHub] [airflow] potiuk commented on pull request #27866: Cloudera Airflow provider.

2022-11-28 Thread GitBox


potiuk commented on PR #27866:
URL: https://github.com/apache/airflow/pull/27866#issuecomment-1329743517

   > With such a big chunk of code and functionality and Cloudera behind it, I 
think we can only accept the code knowing that both ICLA ( by the person 
submitting it) and CCLA (by Cloudera) are signed:
   
   That one is, I think, fulfilled (it would be great if you mentioned that in
the voting thread). I checked that this is in the CCLA records of the foundation:
   
   ```
   Cloudera, Inc. - Woo Kim:w...@cloudera.com:Signed Corp CLA for Philippe 
Lanoe, Gyorgy Orban, Ihor Lukianov, Eugene Anufriev, Vedant Lodha
   ```





[GitHub] [airflow] dstandish closed pull request #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish closed pull request #27971: Experiment: Run static checks and 
postgres tests before all else
URL: https://github.com/apache/airflow/pull/27971





[GitHub] [airflow] potiuk commented on pull request #27866: Cloudera Airflow provider.

2022-11-28 Thread GitBox


potiuk commented on PR #27866:
URL: https://github.com/apache/airflow/pull/27866#issuecomment-1329740644

   Also one extra requirement. 
   
   With such a big chunk of code and functionality and Cloudera behind it, I 
think we can only accept the code knowing that both ICLA ( by the person 
submitting it) and CCLA (by Cloudera) are signed: 
   
   https://www.apache.org/licenses/contributor-agreements.html





[GitHub] [airflow] potiuk commented on pull request #27866: Cloudera Airflow provider.

2022-11-28 Thread GitBox


potiuk commented on PR #27866:
URL: https://github.com/apache/airflow/pull/27866#issuecomment-1329736900

   > What is not 100% clear from the documentation in our case is that Cloudera 
provider would be introduced with a min dependency of 2.3.0.
   
   Yes. As a rule, the community does not maintain pre-2.3.0 providers as of
October 11th 2022 (12 months after 2.2.0 was released). Any new providers
accepted will have to follow that rule, otherwise it would require an extra
maintenance burden for the community.
   
   You can still release your providers in a version which is compatible with
pre-2.3.0 - as a different package, maintained by Cloudera.
   
   >  In the case we want to maintain compatibility for older versions, we 
would have to open a PR with this provider against an older version of Airflow, 
and it would automatically be released (however only technically, only 
supported by the maintainer and not the community) with the next wave of 
provider release along with the latest version of the provider, which is 
supported by the community? Is this statement correct?
   
   No, it's not. We never release any new provider which is compatible with a
pre-2.3.0 (currently - in April it will be pre-2.4.0) Airflow version. We can
at most release a patchlevel (bugfix) version with only changes that are
cherry-picked from main - no new features, and no new providers either.
   
   This patchlevel release is never automatic, and we have actually never
expected to release providers for older versions - we have no process for that.
The process that we have is prepared for releasing a bugfix-only (patchlevel,
no new features) version of an already released old provider version. There the
burden is on someone who wants to do it to cherry-pick only existing changes
and update the release documentation - then we might attempt to release it, but
this should always be an exceptional case involving communication on the
devlist and announcing the readiness of it. Basically, whoever takes the lead
there is, up to the last moment of preparing the package, responsible for
building and testing it.
   
   But this is not even your case now. We have no automation (nor time to help
in updating our automation) to just release old versions of a new provider
retroactively. If we accept it, we will release 2.3.0+ only, and if you want
to, you can release a compatible package with pre-2.3.0 support (but with a
different name - you should never release anything that has
apache-airflow-providers-* in PyPI without going through the Apache Airflow
release process). And then you are fully on your own with compatibility/bugfixes
etc. Of course, that's my point of view, and you can convince the community to
do it otherwise, but I am not even sure what to do if this approach is taken.
   
   Also - regardless of the above rules that we have in the community - I think
you should call for a vote on the devlist on whether the community wants to
accept the provider. We are not automatically accepting any provider; it has to
be justified, and there are various people and various opinions on whether we
should accept the maintenance of more providers or not. Accepting a provider is
not only a "benefit" but also a "burden" that the community will have to accept.
   
   I believe it is quite a substantial piece of code that you want to hand over
to the Apache Software Foundation, with some extra dependencies, and we need to
make sure we really want to take over its maintenance.
   
   Since you now have the PR and we know the full extent of it, I suggest you
start a voting thread on our devlist
https://www.apache.org/foundation/voting.html#votes-on-code-modification with
some justification, and let's see what the vote decides. Having the PR and
looking at the code/quality and the burden, it will be easier to make the
decision, I think - but I believe it is an important-enough case to call for a
formal vote rather than rely on simple PR approval.
   





[GitHub] [airflow] potiuk commented on pull request #27843: Restore removed (but used) methods in common.sql

2022-11-28 Thread GitBox


potiuk commented on PR #27843:
URL: https://github.com/apache/airflow/pull/27843#issuecomment-1329707915

   > Is there an ETA to officially release new version 1.3.1 and get rid of 
1.3.0? All my pipelines in Dev are broken until "_get_failed_checks" is 
recovered. Thanks!
   
   We are in the process of voting on and testing the release now. It is
governed by https://www.apache.org/foundation/voting.html#ReleaseVotes, and the
current release vote is in this discussion on our devlist:
https://lists.apache.org/thread/cvzkqhdqhwhyb3gp81nzs9zcl3hjwxnz
   
   You can downgrade to 1.2.0 as a workaround.
   
   Also, if you want to help with testing and making us more confident in the
release, I heartily invite you @luixi79 to test the RC3 candidates - all the
details are in #27939. You will find information about the RC3 candidates
there, and by following the PyPI links you should be able to install the RC.
   
   And I also encourage you to report the status - whether your problem is
fixed - and generally to help raise confidence that the release is good.
Responding on the devlist is also cool - stating, for example, that it fixes
your problem and that you cannot wait for it.
   
   The voting process of the Apache Software Foundation lasts for 72 hours, and
this period ends tomorrow mid-day CET. But whether we will have enough votes by
that time or will wait longer also depends on the confidence we get from people
like you testing the RCs and telling us "yeah, it works, I tested it - it
solves my problem".
   
   So when you are asking me for an ETA, my response is: the more you (and
users like you) help with building confidence, the faster it happens - but not
sooner than tomorrow mid-day CET.





[GitHub] [airflow] potiuk commented on issue #24988: sendgrid as a data source provider? #24815

2022-11-28 Thread GitBox


potiuk commented on issue #24988:
URL: https://github.com/apache/airflow/issues/24988#issuecomment-1329685452

   > Just to double check my understanding of the release cycle as outlined 
[here](https://github.com/apache/airflow#release-process-for-providers). If I 
implement this provider, does that require me to continue to maintain the 
versioning of this provider indefinitely?
   
   What is in main and accepted by the community is maintained by the 
community. It would be nice if you stuck around and helped solve issues, but 
if the code is sufficiently covered by tests, others will be able to fix any 
problems raised. 
   
   Our CI is pretty comprehensive and review is rather detailed, so once the PR 
is merged, all the requirements are checked. If you get it to the point where it 
is mergeable and merged (and approved), nothing more is "expected" from your 
side, but staying around to take some care (especially of teething problems) 
would certainly be nice. But Airflow is built by volunteers; there 
is no way we can force or oblige anyone (and that includes committers and PMC 
members) to continue contributing. As an individual you have completely free 
will.
   
   There might be a number of problems that make it difficult to merge 
(dependency issues mainly) - because we need to make sure all our providers 
work "together" and that we have a consistent set of dependencies. Community 
work is focused only on "main" releases of providers - basically we build and 
release only the latest version (and depending on the changes applied by the 
community, it might be a patchlevel/feature/breaking change). We also do not 
currently have a way to run the tests against "real" services - so we rely on 
those who contribute changes to test them - which also means that some versions 
of providers might occasionally get broken.
   
   For those who wish to give more care - we have the 
[AIP-47](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-47+New+design+of+Airflow+System+Tests)
 effort - to allow external parties to use "example_dags" as system tests and 
run them regularly - and anyone can volunteer to do so and to build a pipeline 
that runs actual "system tests" against real services. This is not 
mandatory and requires some party (individual or organisation) to volunteer to 
do it. Currently Amazon and Google are preparing to run such regular tests on 
their providers. Others (like Databricks) might follow (I hope) - if Sendgrid 
would like to do it, they can as well - or you could do it yourself. But 
we in the community have no capacity to run them.
   
   Finally, if any party wants to release a provider at one of its past 
releases - they can do so by branching off that past release, 
cherry-picking the changes, and fixing all the problems they find - once they do, 
we can then release the past version (this is what the documentation you linked 
is mostly about). But this only happens if someone wants to release a past version 
of a provider, for reasons of backwards compatibility for example. The latest 
release is always prepared and released by the provider release manager from the 
code currently in main, and it only happens when there are some changes 
since the last release (or when we bump the minor Airflow version). The author of 
the change is usually asked to help with testing the released package (not 
the original author of the provider). In the first release after your change is 
merged, you will be asked to help test it. 
   
   Again - you do not HAVE TO do it - there is no way we can force anyone to do 
it - the expectations above, however, are kind of "good citizenship" 
expectations, and it's great if you can comply. 





[GitHub] [airflow] ashb commented on issue #26497: Upgrading to airflow 2.4.0 from 2.3.4 causes NotNullViolation error

2022-11-28 Thread GitBox


ashb commented on issue #26497:
URL: https://github.com/apache/airflow/issues/26497#issuecomment-1329683155

   Argh! Good catch





[GitHub] [airflow] dstandish opened a new pull request, #27971: Experiment: Run static checks and postgres tests before all else

2022-11-28 Thread GitBox


dstandish opened a new pull request, #27971:
URL: https://github.com/apache/airflow/pull/27971

   This should allow for faster failure / iteration time on non-hosted runners.
   





[GitHub] [airflow] joshowen commented on issue #26497: Upgrading to airflow 2.4.0 from 2.3.4 causes NotNullViolation error

2022-11-28 Thread GitBox


joshowen commented on issue #26497:
URL: https://github.com/apache/airflow/issues/26497#issuecomment-1329681251

   > > We started with 1.8.xx, went to 1.9.xx, 1.10.xx, and somehow all of our 
FAB tables ended up without sequences set for their IDs, but had the sequences 
created. We were seeing similar issues in 2.4.0, and manually ran:
   > > ```sql
   > > ALTER TABLE "public"."ab_permission_view" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_permission_view_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_permission" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_permission_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_permission_view_role" ALTER COLUMN "id" SET 
DEFAULT nextval('ab_permission_view_role_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_register_user" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_register_user_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_role" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_role_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_user" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_user_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_user_role" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_user_role_id_seq'::regclass);
   > > ALTER TABLE "public"."ab_view_menu" ALTER COLUMN "id" SET DEFAULT 
nextval('ab_view_menu_id_seq'::regclass);
   > > ```
   > > 
   > > Which resolved our issue.
   > 
   > We ran into this issue upgrading from 2.3.1 to 2.4.1 so it doesn't seem 
the issue is fixed yet. These table alterations resolved the problem though.
   
   @ashb These must be run before running `0073_2_0_0_prefix_dag_permissions.py`
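   
   As a sketch only (not an official Airflow utility), the eight `ALTER TABLE` 
statements quoted above can be generated from the list of FAB tables, assuming 
the default Postgres sequence naming convention `<table>_id_seq`:
   
   ```python
   # Illustrative sketch: emit the ALTER TABLE fixes for the FAB tables whose
   # "id" columns lost their sequence defaults. The sequence names assume the
   # default Postgres convention "<table>_id_seq" used in the statements above.
   FAB_TABLES = [
       "ab_permission_view",
       "ab_permission",
       "ab_permission_view_role",
       "ab_register_user",
       "ab_role",
       "ab_user",
       "ab_user_role",
       "ab_view_menu",
   ]
   
   def alter_statements(tables):
       # Build one ALTER TABLE statement per table, restoring the default.
       return [
           f'ALTER TABLE "public"."{t}" ALTER COLUMN "id" '
           f"SET DEFAULT nextval('{t}_id_seq'::regclass);"
           for t in tables
       ]
   
   for stmt in alter_statements(FAB_TABLES):
       print(stmt)
   ```
   
   Run the printed statements against the metadata database before the 
`0073_2_0_0_prefix_dag_permissions.py` migration, as noted above.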
   





[GitHub] [airflow] vincbeck commented on a diff in pull request #27962: Add pre-commits preventing accidental API changes in common.sql

2022-11-28 Thread GitBox


vincbeck commented on code in PR #27962:
URL: https://github.com/apache/airflow/pull/27962#discussion_r1033990823


##
airflow/providers/common/sql/README_API.md:
##
@@ -0,0 +1,103 @@
+
+
+# Keeping the API of common.sql provider backwards compatible
+
+The API of the `common.sql` provider should be kept backwards compatible with 
previously released versions.
+The reason is that there are already released providers that use the 
`common.sql` provider and rely on its API and
+behaviour, and any updates to that API or behaviour might potentially
+break those released providers.
+
+Therefore, we should take extra care when changing the APIs.
+
+The approach we take is similar to the one applied by the Android OS 
team to keep their API in check,
+and it is based on storing the current version of the API and flagging changes 
that are potentially breaking.
+This is done by comparing the previous API (stored in stub files) and the 
upcoming API from the PR.
+The upcoming API is automatically extracted from `common.sql` Python files 
using `update-common-sql-api-stubs`
+pre-commit using mypy `stubgen` and stored as `.pyi` files in the 
`airflow.providers.common.sql` package.
+We also post-process the `.pyi` files to add some historically exposed methods 
that should be also
+considered as public API.
+
+If the comparison determines that the change is potentially breaking, the 
contributor is asked
+to review the changes and manually regenerate the stub files.
+
+The details of the workflow are as follows:
+
+1) The previous API is stored in the (committed to repository) stub files.
+2) Every time the common.sql Python files are modified, the 
`update-common-sql-api-stubs` pre-commit
+   regenerates the stubs (including post-processing them) and looks for 
potentially breaking changes
+   (removals or updates of existing classes/methods).
+3) If the check reveals there are no changes to the API, nothing happens, 
pre-commit succeeds.
+4) If there are only additions, the pre-commit automatically updates the stub 
files,
+   asks the contributor to commit resulting updates and fails the pre-commit. 
This is very similar to
+   other static checks that automatically modify/fix source code.
+5) If the pre-commit detects potentially breaking changes, the process is a 
bit more involved for the
+   contributor. The pre-commit flags such changes to the contributor by 
failing the pre-commit and
+   asks the contributor to review the change looking specifically for breaking 
compatibility with previous
+   providers (and fix any backwards compatibility). Once this is completed, 
the contributor is asked to
+   manually and explicitly regenerate and commit the new version of the stubs 
by running the pre-commit
+   with manually added environment variable:
+
+```shell
+UPDATE_COMMON_SQL_API=1 pre-commit run update-common-sql-api-stubs
+```
+
+# Verifying other providers to use only public API of the `common.sql` provider
+
+MyPy automatically checks if only the methods available in the stubs are used. 
This gives enough protection
+for

Review Comment:
   Still a missing piece here no?
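
   For intuition only, the classification the README describes (additions are 
fine, removals are potentially breaking) boils down to a set comparison of 
public names. The sketch below is illustrative and not the actual pre-commit 
code, which uses mypy's `stubgen` plus post-processing on the `.pyi` files:

   ```python
   # Illustrative sketch of the breaking-change detection described in the diff:
   # compare the public names in the committed stub with the newly generated one.
   def classify_api_change(old_api: set[str], new_api: set[str]) -> str:
       removed = old_api - new_api
       added = new_api - old_api
       if removed:
           # Ask the contributor to review and explicitly regenerate the stubs.
           return "potentially-breaking"
       if added:
           # Auto-update the stub files and fail the pre-commit once.
           return "additive"
       # No API change: the pre-commit succeeds.
       return "unchanged"
   ```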






[GitHub] [airflow] Taragolis opened a new pull request, #27970: Replace `unittests` in amazon provider tests by pure `pytest`

2022-11-28 Thread GitBox


Taragolis opened a new pull request, #27970:
URL: https://github.com/apache/airflow/pull/27970

   Migrate Amazon provider's tests to `pytest`.
   
   All changes are more or less straightforward:
   - Get rid of the `unittest.TestCase` class and **TestCase.assert*** methods
   - Replace the `parameterized.expand` decorator by `pytest.mark.parametrize`
   - Replace the `requests_mock.mock` decorator by the `requests_mock` fixture
   - Convert `TestCase.subTest` to parametrized tests
   - Convert class **setUp*** and **tearDown*** methods to [appropriate pytest 
methods](https://docs.pytest.org/en/6.2.x/xunit_setup.html#classic-xunit-style-setup)
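   
   A minimal sketch of the kind of conversion the PR describes, using a 
hypothetical `add` function purely for illustration (the real tests cover the 
Amazon provider's hooks and operators):
   
   ```python
   import pytest
   
   # unittest style (before):
   #
   #   class TestAdd(unittest.TestCase):
   #       def test_add(self):
   #           self.assertEqual(add(1, 2), 3)
   
   def add(x, y):
       # Hypothetical function under test, used purely for illustration.
       return x + y
   
   # pytest style (after): parameterized.expand becomes pytest.mark.parametrize,
   # and TestCase.assertEqual becomes a plain assert.
   @pytest.mark.parametrize(
       "x, y, expected",
       [
           (1, 2, 3),
           (-1, 1, 0),
       ],
   )
   def test_add(x, y, expected):
       assert add(x, y) == expected
   ```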
   





[GitHub] [airflow] potiuk commented on pull request #27962: Add pre-commits preventing accidental API changes in common.sql

2022-11-28 Thread GitBox


potiuk commented on PR #27962:
URL: https://github.com/apache/airflow/pull/27962#issuecomment-1329655368

   > Just a minor comment, other than that, LGTM :)
   
   Fixed and also updated the description a little for the future work.




