Re: [PR] Add CLI support for bulk pause and resume of DAGs [airflow]

2024-03-18 Thread via GitHub


shahar1 commented on code in PR #38265:
URL: https://github.com/apache/airflow/pull/38265#discussion_r1529768511


##
airflow/cli/commands/dag_command.py:
##
@@ -214,14 +215,37 @@ def dag_unpause(args) -> None:
 @providers_configuration_loaded
 def set_is_paused(is_paused: bool, args) -> None:
     """Set is_paused for DAG by a given dag_id."""
-    dag = DagModel.get_dagmodel(args.dag_id)
-
-    if not dag:
-        raise SystemExit(f"DAG: {args.dag_id} does not exist in 'dag' table")
+    should_apply = True
+    dags = [
+        dag
+        for dag in get_dags(args.subdir, dag_id=args.dag_id, use_regex=args.treat_dag_as_regex)

Review Comment:
   `--treat-dag-as-regex` is a flag that indicates that `dag_id` should be 
treated as a regex.
   If we consolidate them into one arg (meaning that `dag_id` is always a 
regex), then `dag_id="dag"` could match `dag`, but also `dag-1`, `dag-2`, 
`dag-3`, etc., and it will become an unnecessary breaking change.
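   The difference is easy to demonstrate with plain `re` (illustrative only; this is not the actual `get_dags` matching code):

   ```python
   import re

   dag_ids = ["dag", "dag-1", "dag-2", "dag-3", "other_dag", "unrelated"]

   # Exact matching: only the literal id is selected.
   exact = [d for d in dag_ids if d == "dag"]

   # Regex matching: "dag" is an unanchored pattern, so it also
   # matches every id that merely contains "dag".
   regex = [d for d in dag_ids if re.search("dag", d)]
   ```

   Making regex matching the only behavior would silently widen what `dag pause dag` touches, which is the breaking change being avoided.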



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] Add refetch button to audit log, format extra [airflow]

2024-03-18 Thread via GitHub


bbovenzi merged PR #38276:
URL: https://github.com/apache/airflow/pull/38276





(airflow) branch main updated (99e0d438e8 -> 6934b461d0)

2024-03-18 Thread bbovenzi
This is an automated email from the ASF dual-hosted git repository.

bbovenzi pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 99e0d438e8 adding: log event for auto pause (#38243)
 add 6934b461d0 Add refetch button to audit log, format extra (#38276)

No new revisions were added by this update.

Summary of changes:
 airflow/www/static/js/components/NewTable/NewCells.tsx |  3 ++-
 airflow/www/static/js/dag/details/AuditLog.tsx | 15 +--
 2 files changed, 15 insertions(+), 3 deletions(-)



Re: [PR] Consolidate HttpOperator http request between sync and async mode [airflow]

2024-03-18 Thread via GitHub


dstandish commented on code in PR #37293:
URL: https://github.com/apache/airflow/pull/37293#discussion_r1529745181


##
airflow/providers/http/hooks/http.py:
##
@@ -385,8 +387,9 @@ async def run(
         for attempt in range(1, 1 + self.retry_limit):
             response = await request_func(
                 url,
-                json=data if self.method in ("POST", "PUT", "PATCH") else None,
                 params=data if self.method == "GET" else None,

Review Comment:
   maybe this should be changed too? this seems fishy also.
   
   maybe this is too much change for folks, but it doesn't seem like a good API to shuffle things around like this, i.e. send `data` to params or data or json depending on this logic. it would seem better to just forward everything blindly. @uranusjr @jedcunningham any thoughts?



##
airflow/providers/http/hooks/http.py:
##
@@ -385,8 +387,9 @@ async def run(
         for attempt in range(1, 1 + self.retry_limit):
             response = await request_func(
                 url,
-                json=data if self.method in ("POST", "PUT", "PATCH") else None,
                 params=data if self.method == "GET" else None,
+                data=data if self.method in ("POST", "PUT", "PATCH") else None,

Review Comment:
   ```suggestion
   data=data,
   ```
   do we really need that logic?  shouldn't we do as the user asks?
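   The two styles under debate can be sketched as plain helpers (hypothetical names, not the provider's API):

   ```python
   def routed_kwargs(method: str, data):
       """Current style: a single `data` argument is shuffled by HTTP method."""
       return {
           "params": data if method == "GET" else None,
           "json": data if method in ("POST", "PUT", "PATCH") else None,
       }

   def forwarded_kwargs(params=None, json=None, data=None):
       """Suggested style: the caller says exactly what they mean; nothing is routed."""
       return {"params": params, "json": json, "data": data}
   ```

   With the forwarding style, a `GET` with a body or a `POST` with query params stops being impossible to express.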






(airflow) branch main updated (e671074137 -> 99e0d438e8)

2024-03-18 Thread eladkal

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from e671074137 Richer Audit Log extra field (#38166)
 add 99e0d438e8 adding: log event for auto pause (#38243)

No new revisions were added by this update.

Summary of changes:
 airflow/models/dagrun.py | 10 ++
 1 file changed, 10 insertions(+)



Re: [PR] adding: log event for auto pause [airflow]

2024-03-18 Thread via GitHub


eladkal merged PR #38243:
URL: https://github.com/apache/airflow/pull/38243





Re: [PR] adding: log event for auto pause [airflow]

2024-03-18 Thread via GitHub


pateash commented on PR #38243:
URL: https://github.com/apache/airflow/pull/38243#issuecomment-2005785583

   > looks like rebase went wrong
   
   fixed it.





[PR] Rewriting tag_providers.sh in python [airflow]

2024-03-18 Thread via GitHub


poorvirohidekar opened a new pull request, #38278:
URL: https://github.com/apache/airflow/pull/38278

   
   
   
   
   closes: https://github.com/apache/airflow/issues/35057
   Rewriting the tag_providers.sh script in Python.
   1. As part of testing this script, I was able to create a Git tag locally corresponding to a version denoted by a .whl file.
   2. I was also able to successfully push this tag to my forked repository.
   
   https://github.com/apache/airflow/assets/53335539/87530e43-b5b6-46ee-8f62-86d00305f9a1
   
   
   
   ---
   **^ Add meaningful description above**
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





Re: [PR] D105 Check for Metrics Module [airflow]

2024-03-18 Thread via GitHub


amoghrajesh commented on code in PR #38271:
URL: https://github.com/apache/airflow/pull/38271#discussion_r1529703789


##
airflow/metrics/protocols.py:
##
@@ -29,9 +29,13 @@
 class TimerProtocol(Protocol):
     """Type protocol for StatsLogger.timer."""
 
-    def __enter__(self) -> Timer: ...
+    def __enter__(self) -> Timer:
+        """Enter the timer."""

Review Comment:
   Maintain in sync with other usages.
   `"""Start the timer."""`



##
airflow/metrics/protocols.py:
##
@@ -29,9 +29,13 @@
 class TimerProtocol(Protocol):
     """Type protocol for StatsLogger.timer."""
 
-    def __enter__(self) -> Timer: ...
+    def __enter__(self) -> Timer:
+        """Enter the timer."""
+        ...
 
-    def __exit__(self, exc_type, exc_value, traceback) -> None: ...
+    def __exit__(self, exc_type, exc_value, traceback) -> None:
+        """Exit the timer."""

Review Comment:
   `Stop the timer`
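   Taken together, the two suggestions would leave the protocol looking roughly like this (a sketch satisfying pydocstyle's D105 check for magic methods; the real `__enter__` returns `Timer`, replaced here with a self-reference to stay self-contained):

   ```python
   from typing import Protocol

   class TimerProtocol(Protocol):
       """Type protocol for StatsLogger.timer."""

       def __enter__(self) -> "TimerProtocol":  # `Timer` in the real module
           """Start the timer."""
           ...

       def __exit__(self, exc_type, exc_value, traceback) -> None:
           """Stop the timer."""
           ...
   ```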






Re: [PR] Resolve PT012 in `apache.spark`, `fab`, `ftp`, `openai` and `papermill` providers tests [airflow]

2024-03-18 Thread via GitHub


amoghrajesh commented on PR #38272:
URL: https://github.com/apache/airflow/pull/38272#issuecomment-2005754206

   @Taragolis there are tons of modules for these. What do you think about creating a long-running issue for this and releasing it as a good first issue for the community?





Re: [I] ElasticSearch task_log_handler wide scope index filters causing overload of ES cluster [airflow]

2024-03-18 Thread via GitHub


Owen-CH-Leung commented on issue #37999:
URL: https://github.com/apache/airflow/issues/37999#issuecomment-2005694869

   Yeah, the second option sounds like a more flexible approach to me too. Happy to help more when we have a draft PR to implement this feature.





Re: [PR] Add CLI support for bulk pause and resume of DAGs [airflow]

2024-03-18 Thread via GitHub


dirrao commented on code in PR #38265:
URL: https://github.com/apache/airflow/pull/38265#discussion_r1529629767


##
airflow/cli/commands/dag_command.py:
##
@@ -214,14 +215,37 @@ def dag_unpause(args) -> None:
 @providers_configuration_loaded
 def set_is_paused(is_paused: bool, args) -> None:
     """Set is_paused for DAG by a given dag_id."""
-    dag = DagModel.get_dagmodel(args.dag_id)
-
-    if not dag:
-        raise SystemExit(f"DAG: {args.dag_id} does not exist in 'dag' table")
+    should_apply = True
+    dags = [
+        dag
+        for dag in get_dags(args.subdir, dag_id=args.dag_id, use_regex=args.treat_dag_as_regex)

Review Comment:
   I believe only one of the `dag_id` or `treat_dag_as_regex` args should be set, not both. Can we add a notification for this?






Re: [I] SparkKubernetesOperator not retrieves logs from the driver pod and displays them in the Airflow UI. [airflow]

2024-03-18 Thread via GitHub


sudohainguyen commented on issue #37681:
URL: https://github.com/apache/airflow/issues/37681#issuecomment-2005591684

   I've been facing the same issue. Should we proceed with this issue so it can be resolved in the next release?





(airflow) branch main updated: Richer Audit Log extra field (#38166)

2024-03-18 Thread jedcunningham

jedcunningham pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new e671074137 Richer Audit Log extra field (#38166)
e671074137 is described below

commit e67107413785fa7ff8e4f0dd89d759120bedfea6
Author: Brent Bovenzi 
AuthorDate: Mon Mar 18 17:58:00 2024 -0700

Richer Audit Log extra field (#38166)
---
 .../api_connexion/endpoints/dataset_endpoint.py|  9 +--
 airflow/www/decorators.py  | 70 ++
 .../endpoints/test_connection_endpoint.py  |  4 +-
 tests/api_connexion/endpoints/test_dag_endpoint.py |  9 +--
 .../endpoints/test_dag_run_endpoint.py | 10 +++-
 .../endpoints/test_dataset_endpoint.py | 24 
 .../endpoints/test_task_instance_endpoint.py   |  6 +-
 .../endpoints/test_variable_endpoint.py| 39 ++--
 tests/test_utils/www.py| 27 +
 tests/www/views/test_views_decorators.py   |  7 +++
 tests/www/views/test_views_paused.py   |  6 +-
 11 files changed, 150 insertions(+), 61 deletions(-)

diff --git a/airflow/api_connexion/endpoints/dataset_endpoint.py 
b/airflow/api_connexion/endpoints/dataset_endpoint.py
index 57c299cade..bfdb8d0a5e 100644
--- a/airflow/api_connexion/endpoints/dataset_endpoint.py
+++ b/airflow/api_connexion/endpoints/dataset_endpoint.py
@@ -46,10 +46,8 @@ from airflow.api_connexion.schemas.dataset_schema import (
 from airflow.datasets import Dataset
 from airflow.datasets.manager import dataset_manager
 from airflow.models.dataset import DatasetDagRunQueue, DatasetEvent, 
DatasetModel
-from airflow.security import permissions
 from airflow.utils import timezone
 from airflow.utils.db import get_query_count
-from airflow.utils.log.action_logger import action_event_from_permission
 from airflow.utils.session import NEW_SESSION, provide_session
 from airflow.www.decorators import action_logging
 from airflow.www.extensions.init_auth_manager import get_auth_manager
@@ -330,12 +328,7 @@ def delete_dataset_queued_events(
 
 @security.requires_access_dataset("POST")
 @provide_session
-@action_logging(
-event=action_event_from_permission(
-prefix=RESOURCE_EVENT_PREFIX,
-permission=permissions.ACTION_CAN_CREATE,
-),
-)
+@action_logging
 def create_dataset_event(session: Session = NEW_SESSION) -> APIResponse:
 """Create dataset event."""
 body = get_json_request_dict()
diff --git a/airflow/www/decorators.py b/airflow/www/decorators.py
index 91146d0fee..3eae5f6239 100644
--- a/airflow/www/decorators.py
+++ b/airflow/www/decorators.py
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
     Mask the 'val_content' field if 'key_content' is in the mask list.
 
     The variable requests values and args comes in this form:
-    [('key', 'key_content'),('val', 'val_content'), ('description', 'description_content')]
+    {'key': 'key_content', 'val': 'val_content', 'description': 'description_content'}
     """
-    result = []
+    result = {}
     keyname = None
-    for k, v in extra_fields:
+    for k, v in extra_fields.items():
         if k == "key":
             keyname = v
-            result.append((k, v))
-        elif keyname and k == "val":
+            result[k] = v
+        elif keyname and (k == "val" or k == "value"):
             x = secrets_masker.redact(v, keyname)
-            result.append((k, x))
+            result[k] = x
             keyname = None
         else:
-            result.append((k, v))
+            result[k] = v
     return result
 
 
 def _mask_connection_fields(extra_fields):
     """Mask connection fields."""
-    result = []
-    for k, v in extra_fields:
-        if k == "extra":
+    result = {}
+    for k, v in extra_fields.items():
+        if k == "extra" and v:
             try:
                 extra = json.loads(v)
-                extra = [(k, secrets_masker.redact(v, k)) for k, v in extra.items()]
-                result.append((k, json.dumps(dict(extra))))
+                extra = {k: secrets_masker.redact(v, k) for k, v in extra.items()}
+                result[k] = dict(extra)
             except json.JSONDecodeError:
-                result.append((k, "Encountered non-JSON in `extra` field"))
+                result[k] = "Encountered non-JSON in `extra` field"
         else:
-            result.append((k, secrets_masker.redact(v, k)))
+            result[k] = secrets_masker.redact(v, k)
     return result
 
 
@@ -94,35 +94,55 @@ def action_logging(func: T | None = None, event: str | None = None) -> T | Calla
             user = get_auth_manager().get_user_name()
             user_display = get_auth_manager().get_user_display_name()
 
-            fields_skip_logging = {"csrf_token", "_csrf_token", "is_paused"}
-

Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


jedcunningham merged PR #38166:
URL: https://github.com/apache/airflow/pull/38166





Re: [PR] Disable celery `task_acks_late` [airflow]

2024-03-18 Thread via GitHub


hussein-awala commented on PR #31829:
URL: https://github.com/apache/airflow/pull/31829#issuecomment-2005469245

   Added by #37066





Re: [PR] Disable celery `task_acks_late` [airflow]

2024-03-18 Thread via GitHub


hussein-awala closed pull request #31829: Disable celery `task_acks_late`
URL: https://github.com/apache/airflow/pull/31829





Re: [PR] Disable celery `task_acks_late` [airflow]

2024-03-18 Thread via GitHub


github-actions[bot] commented on PR #31829:
URL: https://github.com/apache/airflow/pull/31829#issuecomment-2005404819

   This pull request has been automatically marked as stale because it has not 
had recent activity. It will be closed in 5 days if no further activity occurs. 
Thank you for your contributions.





Re: [I] Airflow UI marks task as failed but the task still running in kubernetes [airflow]

2024-03-18 Thread via GitHub


github-actions[bot] commented on issue #37800:
URL: https://github.com/apache/airflow/issues/37800#issuecomment-2005404488

   This issue has been automatically marked as stale because it has been open 
for 14 days with no response from the author. It will be closed in next 7 days 
if no further activity occurs from the issue author.





Re: [I] Airflow task fails when no permission to bash [airflow]

2024-03-18 Thread via GitHub


hussein-awala commented on issue #38257:
URL: https://github.com/apache/airflow/issues/38257#issuecomment-2005359468

   Are you using the official Docker image? If yes, could you provide the hash of the image?





Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on code in PR #38166:
URL: https://github.com/apache/airflow/pull/38166#discussion_r1529467730


##
airflow/www/decorators.py:
##
@@ -94,35 +94,54 @@ def wrapper(*args, **kwargs):
             user = get_auth_manager().get_user_name()
             user_display = get_auth_manager().get_user_display_name()
 
-            fields_skip_logging = {"csrf_token", "_csrf_token", "is_paused"}
-            extra_fields = [
-                (k, secrets_masker.redact(v, k))
+            isAPIRequest = request.blueprint == "/api/v1"
+            hasJsonBody = request.headers.get("content-type") == "application/json" and request.json
+
+            fields_skip_logging = {
+                "csrf_token",
+                "_csrf_token",
+                "is_paused",
+                "dag_id",
+                "task_id",
+                "dag_run_id",
+                "run_id",
+                "execution_date",
+            }
+            extra_fields = {
+                k: secrets_masker.redact(v, k)
                 for k, v in itertools.chain(request.values.items(multi=True), request.view_args.items())
                 if k not in fields_skip_logging
-            ]
+            }
             if event and event.startswith("variable."):
-                extra_fields = _mask_variable_fields(extra_fields)
-            if event and event.startswith("connection."):
-                extra_fields = _mask_connection_fields(extra_fields)
+                extra_fields = _mask_variable_fields(
+                    request.json if isAPIRequest and hasJsonBody else extra_fields
+                )
+            elif event and event.startswith("connection."):
+                extra_fields = _mask_connection_fields(
+                    request.json if isAPIRequest and hasJsonBody else extra_fields
+                )
+            elif hasJsonBody:
+                extra_fields = {**extra_fields, **request.json}

Review Comment:
   Added and with a test of `dataset_event.extra`






Re: [PR] Consolidate HttpOperator http request between sync and async mode [airflow]

2024-03-18 Thread via GitHub


hussein-awala commented on code in PR #37293:
URL: https://github.com/apache/airflow/pull/37293#discussion_r1529458809


##
airflow/providers/http/hooks/http.py:
##
@@ -320,6 +320,7 @@ async def run(
         data: dict[str, Any] | str | None = None,
         headers: dict[str, Any] | None = None,
         extra_options: dict[str, Any] | None = None,
+        data_type: Literal["json", "dict"] = "json",

Review Comment:
   I like the idea of having two parameters, one for data and the other for json; let me try it.






[PR] Apply D105 to the Models Module Partly [airflow]

2024-03-18 Thread via GitHub


Satoshi-Sh opened a new pull request, #38277:
URL: https://github.com/apache/airflow/pull/38277

   ## Related Issue
   #37523
   ## Checks
   - [x] airflow/models/abstractoperator.py
   - [x] airflow/models/baseoperator.py
   - [x] airflow/models/connection.py
   - [x] airflow/models/dag.py
   - [x] airflow/models/dagrun.py
   - [x] airflow/models/dagwarning.py
   - [x] airflow/models/dataset.py
   - [x] airflow/models/expandinput.py
   - [x] airflow/models/log.py
   - [x] airflow/models/mappedoperator.py
   
   
   
   
   
   
   
   
   
   ---
   **^ Add meaningful description above**
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





[PR] Add refetch button to audit log, format extra [airflow]

2024-03-18 Thread via GitHub


bbovenzi opened a new pull request, #38276:
URL: https://github.com/apache/airflow/pull/38276

   Once we switch the audit log extra to json we should format it correctly. 
Also, it is nice to have a manual refetch button.
   
   Before:
   https://github.com/apache/airflow/assets/4600967/30f47dd4-d5cf-488d-a6b2-65b6ec9ffda2
   
   
   After:
   https://github.com/apache/airflow/assets/4600967/2eb5ac89-4fc8-4672-a05a-480370bd5c81
   
   
   
   
   ---
   **^ Add meaningful description above**
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


jedcunningham commented on code in PR #38166:
URL: https://github.com/apache/airflow/pull/38166#discussion_r1529417066


##
airflow/www/decorators.py:
##
@@ -94,35 +94,54 @@ def wrapper(*args, **kwargs):
            user = get_auth_manager().get_user_name()
            user_display = get_auth_manager().get_user_display_name()
 
-            fields_skip_logging = {"csrf_token", "_csrf_token", "is_paused"}
-            extra_fields = [
-                (k, secrets_masker.redact(v, k))
+            isAPIRequest = request.blueprint == "/api/v1"
+            hasJsonBody = request.headers.get("content-type") == "application/json" and request.json
+
+            fields_skip_logging = {
+                "csrf_token",
+                "_csrf_token",
+                "is_paused",
+                "dag_id",
+                "task_id",
+                "dag_run_id",
+                "run_id",
+                "execution_date",
+            }
+            extra_fields = {
+                k: secrets_masker.redact(v, k)
                 for k, v in itertools.chain(request.values.items(multi=True), request.view_args.items())
                 if k not in fields_skip_logging
-            ]
+            }
             if event and event.startswith("variable."):
-                extra_fields = _mask_variable_fields(extra_fields)
-            if event and event.startswith("connection."):
-                extra_fields = _mask_connection_fields(extra_fields)
+                extra_fields = _mask_variable_fields(
+                    request.json if isAPIRequest and hasJsonBody else extra_fields
+                )
+            elif event and event.startswith("connection."):
+                extra_fields = _mask_connection_fields(
+                    request.json if isAPIRequest and hasJsonBody else extra_fields
+                )
+            elif hasJsonBody:
+                extra_fields = {**extra_fields, **request.json}

Review Comment:
   Do we need to mask `request.json`?
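   For context, masking a JSON body before logging could look like this standalone sketch (not Airflow's `secrets_masker`; the sensitive-key list is an assumption for illustration):

   ```python
   # Hypothetical sensitive key names; real masking is configurable.
   SENSITIVE_KEYS = {"password", "api_key", "private_key"}

   def redact(value, name=None):
       """Recursively replace values whose key looks sensitive."""
       if name and name.lower() in SENSITIVE_KEYS:
           return "***"
       if isinstance(value, dict):
           return {k: redact(v, k) for k, v in value.items()}
       if isinstance(value, list):
           return [redact(v) for v in value]
       return value
   ```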






Re: [PR] Delete all old dag pages and redirect to grid view [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on PR #37988:
URL: https://github.com/apache/airflow/pull/37988#issuecomment-2005196310

   > I was hoping that the last two pages are also replaced with the last 
commit as notes in [#37988 
(review)](https://github.com/apache/airflow/pull/37988#pullrequestreview-1932895524)
 - but LGTM before having this waiting until 2.10 - leftovers can also be made 
in a follow-up PR...
   
   Which last two pages are you referring to? Finishing off the task specific 
ones like `rendered_templates`? That will be a follow-up PR.





Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on PR #38166:
URL: https://github.com/apache/airflow/pull/38166#issuecomment-2005191892

   Added `expected_extra` as an optional field for our `_check_last_log()` test 
function.





Re: [PR] Consolidate HttpOperator http request between sync and async mode [airflow]

2024-03-18 Thread via GitHub


dstandish commented on code in PR #37293:
URL: https://github.com/apache/airflow/pull/37293#discussion_r1529377720


##
airflow/providers/http/hooks/http.py:
##
@@ -320,6 +320,7 @@ async def run(
         data: dict[str, Any] | str | None = None,
         headers: dict[str, Any] | None = None,
         extra_options: dict[str, Any] | None = None,
+        data_type: Literal["json", "dict"] = "json",

Review Comment:
   yeah i think your first attempt was the best move: 
   
   https://github.com/apache/airflow/pull/37293/commits/01fee3ef57b28b6c5cbe551e982e04278df7a7ef
   
   it was just a bug. json should go to json, and data to data, no?
   
   you can make breaking changes because it's a provider, and just bump the major version






Re: [PR] Consolidate HttpOperator http request between sync and async mode [airflow]

2024-03-18 Thread via GitHub


dstandish commented on code in PR #37293:
URL: https://github.com/apache/airflow/pull/37293#discussion_r1529374000


##
airflow/providers/http/hooks/http.py:
##
@@ -320,6 +320,7 @@ async def run(
 data: dict[str, Any] | str | None = None,
 headers: dict[str, Any] | None = None,
 extra_options: dict[str, Any] | None = None,
+data_type: Literal["json", "dict"] = "json",

Review Comment:
   hmm 
   why don't you just have params `json` and `data` and use whichever one you 
get?
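
The suggestion above - separate `json` and `data` parameters, forwarding whichever one is set - can be sketched with a hypothetical helper (not the provider's actual API), mirroring how `requests`/`aiohttp` distinguish a JSON body from form/raw data:

```python
def build_request_kwargs(json=None, data=None):
    """Forward whichever payload kwarg the caller provided."""
    kwargs = {}
    if json is not None:
        kwargs["json"] = json  # serialized as a JSON request body
    if data is not None:
        kwargs["data"] = data  # form-encoded or raw body
    return kwargs

# json goes to json, data goes to data - no cross-wiring
print(build_request_kwargs(json={"a": 1}))  # {'json': {'a': 1}}
```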



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(airflow) 07/07: Fix filtered TI links (#38274)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bcdd076200a4caf1f7f1533e1d2f5dddaff2fb56
Author: Jed Cunningham <66968678+jedcunning...@users.noreply.github.com>
AuthorDate: Mon Mar 18 16:17:58 2024 -0600

Fix filtered TI links (#38274)

We have links to the graph, filtered down to a specific TI. This was
accidentally broken in #38096.

(cherry picked from commit 8ea95e778a1a0b90aaeecd156d6924915a6f1c27)
---
 airflow/www/utils.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index 389d9e25ca..6f7f9c5945 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -435,6 +435,7 @@ def task_instance_link(attr):
 "Airflow.grid",
 dag_id=dag_id,
 task_id=task_id,
+root=task_id,
 dag_run_id=run_id,
 tab="graph",
 map_index=attr.get("map_index", -1),
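
For context on why adding `root=task_id` filters the view: Flask's `url_for` turns keyword arguments that are not route variables into query-string parameters. A stdlib-only sketch of the resulting link - the path and values below are hypothetical, not Airflow's actual route:

```python
from urllib.parse import urlencode

# keyword args beyond the route variables become the query string
query = urlencode(
    {
        "task_id": "extract",
        "root": "extract",  # the newly added filter root
        "dag_run_id": "manual__2024-03-18",
        "tab": "graph",
        "map_index": -1,
    }
)
url_root = f"/dags/example_dag/grid?{query}"
```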



(airflow) 05/07: Fix excessive permission changing for log task handler (#38164)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e6ab8fc158d377f1593fdddf7dd35a9200964d35
Author: Jarek Potiuk 
AuthorDate: Mon Mar 18 19:28:37 2024 +0100

Fix excessive permission changing for log task handler (#38164)

When the log handler changed the log dir permissions, it also
changed the permissions of parent folders recursively. However,
that went too far when the log folder was created under the home
directory, because in some cases changing permissions might lead
to unexpected side-effects - such as losing the ability to log
in to an ssh server.

Fixes: #38137
(cherry picked from commit 2c1d0f8c4121589e92e4c0ca1665b89a2691d2d8)
---
 airflow/utils/log/file_task_handler.py | 41 --
 tests/utils/test_log_handlers.py   | 28 ++-
 2 files changed, 21 insertions(+), 48 deletions(-)

diff --git a/airflow/utils/log/file_task_handler.py 
b/airflow/utils/log/file_task_handler.py
index 7e1faca4ec..e6dd540267 100644
--- a/airflow/utils/log/file_task_handler.py
+++ b/airflow/utils/log/file_task_handler.py
@@ -159,31 +159,6 @@ def _ensure_ti(ti: TaskInstanceKey | TaskInstance, 
session) -> TaskInstance:
 raise AirflowException(f"Could not find TaskInstance for {ti}")
 
 
-def _change_directory_permissions_up(directory: Path, folder_permissions: int):
-"""
-Change permissions of the given directory and its parents.
-
-Only attempt to change permissions for directories owned by the current 
user.
-
-:param directory: directory to change permissions of (including parents)
-:param folder_permissions: permissions to set
-"""
-if directory.stat().st_uid == os.getuid():
-if directory.stat().st_mode % 0o1000 != folder_permissions % 0o1000:
-print(f"Changing {directory} permission to {folder_permissions}")
-try:
-directory.chmod(folder_permissions)
-except PermissionError as e:
-# In some circumstances (depends on user and filesystem) we 
might not be able to
-# change the permission for the folder (when the folder was 
created by another user
-# before or when the filesystem does not allow to change 
permission). We should not
-# fail in this case but rather ignore it.
-print(f"Failed to change {directory} permission to 
{folder_permissions}: {e}")
-return
-if directory.parent != directory:
-_change_directory_permissions_up(directory.parent, 
folder_permissions)
-
-
 class FileTaskHandler(logging.Handler):
 """
 FileTaskHandler is a python log handler that handles and reads task 
instance logs.
@@ -481,7 +456,8 @@ class FileTaskHandler(logging.Handler):
 
 return logs, metadata_array
 
-def _prepare_log_folder(self, directory: Path):
+@staticmethod
+def _prepare_log_folder(directory: Path, new_folder_permissions: int):
 """
 Prepare the log folder and ensure its mode is as configured.
 
@@ -505,11 +481,9 @@ class FileTaskHandler(logging.Handler):
 sure that the same group is set as default group for both - 
impersonated user and main airflow
 user.
 """
-new_folder_permissions = int(
-conf.get("logging", "file_task_handler_new_folder_permissions", 
fallback="0o775"), 8
-)
-directory.mkdir(mode=new_folder_permissions, parents=True, 
exist_ok=True)
-_change_directory_permissions_up(directory, new_folder_permissions)
+for parent in reversed(directory.parents):
+parent.mkdir(mode=new_folder_permissions, exist_ok=True)
+directory.mkdir(mode=new_folder_permissions, exist_ok=True)
 
 def _init_file(self, ti, *, identifier: str | None = None):
 """
@@ -531,7 +505,10 @@ class FileTaskHandler(logging.Handler):
 # if this is true, we're invoked via set_context in the context of
 # setting up individual trigger logging. return trigger log path.
 full_path = self.add_triggerer_suffix(full_path=full_path, 
job_id=ti.triggerer_job.id)
-self._prepare_log_folder(Path(full_path).parent)
+new_folder_permissions = int(
+conf.get("logging", "file_task_handler_new_folder_permissions", 
fallback="0o775"), 8
+)
+self._prepare_log_folder(Path(full_path).parent, 
new_folder_permissions)
 
 if not os.path.exists(full_path):
 open(full_path, "a").close()
diff --git a/tests/utils/test_log_handlers.py b/tests/utils/test_log_handlers.py
index 36810506da..6017c761c9 100644
--- a/tests/utils/test_log_handlers.py
+++ b/tests/utils/test_log_handlers.py
@@ -22,7 +22,6 @@ import logging.config
 import os
 import re
 import shutil
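
A stdlib-only sketch of the new `_prepare_log_folder` strategy shown in the diff above: create each missing ancestor with the configured mode, and leave directories that already exist (such as the home directory) untouched instead of chmod-ing them recursively:

```python
import tempfile
from pathlib import Path

new_folder_permissions = 0o775  # mirrors the config fallback in the diff

with tempfile.TemporaryDirectory() as tmp:
    directory = Path(tmp) / "dag_id" / "run_id" / "attempt_1"
    # mkdir(exist_ok=True) is a no-op for pre-existing ancestors, so their
    # permissions are never modified - only newly created folders get the mode
    for parent in reversed(directory.parents):
        parent.mkdir(mode=new_folder_permissions, exist_ok=True)
    directory.mkdir(mode=new_folder_permissions, exist_ok=True)
    created = directory.is_dir()
```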

(airflow) 06/07: bump `croniter` to fix an issue with 29 Feb cron expressions (#38198)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 62f636f0d4dc410e9dd366cc605477bbd520c821
Author: Hussein Awala 
AuthorDate: Sat Mar 16 01:08:40 2024 +0100

bump `croniter` to fix an issue with 29 Feb cron expressions (#38198)

(cherry picked from commit 2e3b1758246169b278e7d409ecff98532ec18c18)
---
 pyproject.toml   |  2 +-
 tests/models/test_dag.py | 13 +
 2 files changed, 14 insertions(+), 1 deletion(-)

diff --git a/pyproject.toml b/pyproject.toml
index 001dd86d37..2a3cac3008 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -87,7 +87,7 @@ dependencies = [
 # This limit can be removed after 
https://github.com/apache/airflow/issues/35234 is fixed
 "connexion[flask]>=2.10.0,<3.0",
 "cron-descriptor>=1.2.24",
-"croniter>=0.3.17",
+"croniter>=2.0.2",
 "cryptography>=0.9.3",
 "deprecated>=1.2.13",
 "dill>=0.2.2",
diff --git a/tests/models/test_dag.py b/tests/models/test_dag.py
index 152bed8e94..31a7460704 100644
--- a/tests/models/test_dag.py
+++ b/tests/models/test_dag.py
@@ -2453,6 +2453,19 @@ my_postgres_conn:
 next_subdag_info = subdag.next_dagrun_info(None)
 assert next_subdag_info is None, "SubDags should never have DagRuns 
created by the scheduler"
 
+def test_next_dagrun_info_on_29_feb(self):
+dag = DAG(
+"test_scheduler_dagrun_29_feb", start_date=timezone.datetime(2024, 
1, 1), schedule="0 0 29 2 *"
+)
+
+next_info = dag.next_dagrun_info(None)
+assert next_info and next_info.logical_date == timezone.datetime(2024, 
2, 29)
+
+next_info = dag.next_dagrun_info(next_info.data_interval)
+assert next_info.logical_date == timezone.datetime(2028, 2, 29)
+assert next_info.data_interval.start == timezone.datetime(2028, 2, 29)
+assert next_info.data_interval.end == timezone.datetime(2032, 2, 29)
+
 def test_replace_outdated_access_control_actions(self):
 outdated_permissions = {
 "role1": {permissions.ACTION_CAN_READ, 
permissions.ACTION_CAN_EDIT},
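
The 29 Feb semantics that the `croniter` bump fixes can be illustrated with the stdlib alone - a sketch only, since the real fix lives inside croniter's cron evaluation: `0 0 29 2 *` fires only in leap years, so the run after 2024-02-29 is 2028-02-29.

```python
import calendar
from datetime import datetime

def next_feb_29(after: datetime) -> datetime:
    """Next date matching cron day/month '29 2' strictly after `after`."""
    year = after.year + 1 if (after.month, after.day) >= (2, 29) else after.year
    while not calendar.isleap(year):
        year += 1
    return datetime(year, 2, 29)

print(next_feb_29(datetime(2024, 1, 1)))   # 2024-02-29 00:00:00
print(next_feb_29(datetime(2024, 2, 29)))  # 2028-02-29 00:00:00
```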



(airflow) branch v2-8-test updated (19d684d3ce -> bcdd076200)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 19d684d3ce Update documentation about recent image changes (#38085)
 new 07b32f04fe Fix hash caching in `ObjectStoragePath` (#37769)
 new c44f5396be Add padding to prevent grid horizontal scroll overlapping 
tasks (#37942)
 new 8c4f10427c Fix a bug where scheduler heartrate parameter were not used 
(#37992)
 new 0e60195448 Fix task instances list link (#38096)
 new e6ab8fc158 Fix excessive permission changing for log task handler 
(#38164)
 new 62f636f0d4 bump `croniter` to fix an issue with 29 Feb cron 
expressions (#38198)
 new bcdd076200 Fix filtered TI links (#38274)

The 7 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/io/path.py   |  6 ++---
 airflow/jobs/job.py  |  2 ++
 airflow/jobs/scheduler_job_runner.py |  1 -
 airflow/utils/log/file_task_handler.py   | 41 +++-
 airflow/www/static/js/dag/grid/index.tsx |  1 +
 airflow/www/utils.py | 11 +++--
 pyproject.toml   |  2 +-
 tests/io/test_path.py|  9 +++
 tests/jobs/test_base_job.py  |  9 ---
 tests/jobs/test_scheduler_job.py | 10 
 tests/models/test_dag.py | 13 ++
 tests/utils/test_log_handlers.py | 28 ++
 12 files changed, 75 insertions(+), 58 deletions(-)



(airflow) 04/07: Fix task instances list link (#38096)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0e60195448d6a8b0dca6a72e279b1e41efe37c35
Author: Brent Bovenzi 
AuthorDate: Wed Mar 13 09:46:07 2024 -0400

Fix task instances list link (#38096)

(cherry picked from commit 43d7f6d3da0192c3abd07c9f0dab54e6da605621)
---
 airflow/www/utils.py | 10 --
 1 file changed, 8 insertions(+), 2 deletions(-)

diff --git a/airflow/www/utils.py b/airflow/www/utils.py
index 16d0950989..389d9e25ca 100644
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -422,6 +422,7 @@ def task_instance_link(attr):
 """Generate a URL to the Graph view for a TaskInstance."""
 dag_id = attr.get("dag_id")
 task_id = attr.get("task_id")
+run_id = attr.get("run_id")
 execution_date = attr.get("dag_run.execution_date") or 
attr.get("execution_date") or timezone.utcnow()
 url = url_for(
 "Airflow.task",
@@ -431,13 +432,18 @@ def task_instance_link(attr):
 map_index=attr.get("map_index", -1),
 )
 url_root = url_for(
-"Airflow.graph", dag_id=dag_id, root=task_id, 
execution_date=execution_date.isoformat()
+"Airflow.grid",
+dag_id=dag_id,
+task_id=task_id,
+dag_run_id=run_id,
+tab="graph",
+map_index=attr.get("map_index", -1),
 )
 return Markup(
 """
 
 {task_id}
-
+
 filter_alt
 



(airflow) 02/07: Add padding to prevent grid horizontal scroll overlapping tasks (#37942)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c44f5396be0b2d7a01cc08ccf03cf29fbf1c8b48
Author: Brent Bovenzi 
AuthorDate: Wed Mar 6 14:02:02 2024 -0500

Add padding to prevent grid horizontal scroll overlapping tasks (#37942)

(cherry picked from commit 0c9487f9466587f6cbc443ab2c9dc17d35c98f5d)
---
 airflow/www/static/js/dag/grid/index.tsx | 1 +
 1 file changed, 1 insertion(+)

diff --git a/airflow/www/static/js/dag/grid/index.tsx 
b/airflow/www/static/js/dag/grid/index.tsx
index b7299e5ba3..079837d087 100644
--- a/airflow/www/static/js/dag/grid/index.tsx
+++ b/airflow/www/static/js/dag/grid/index.tsx
@@ -167,6 +167,7 @@ const Grid = ({
 position="relative"
 mt={8}
 overscrollBehavior="auto"
+pb={4}
   >
 
   



(airflow) 03/07: Fix a bug where scheduler heartrate parameter were not used (#37992)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 8c4f10427cac2c679bdab5063d1fde84c88a5f5b
Author: Jarek Potiuk 
AuthorDate: Fri Mar 8 14:31:46 2024 +0100

Fix a bug where scheduler heartrate parameter were not used (#37992)

Since #30255, the scheduler heartrate has not been properly calculated.
We missed the check for the SchedulerJob type and setting the heartrate
value from `scheduler_health_check_threshold`.

This PR fixes it.

Fix: #37971
(cherry picked from commit 01e40abc759a219cc359fab6ca73434cca55901a)
---
 airflow/jobs/job.py  |  2 ++
 airflow/jobs/scheduler_job_runner.py |  1 -
 tests/jobs/test_base_job.py  |  9 ++---
 tests/jobs/test_scheduler_job.py | 10 ++
 4 files changed, 18 insertions(+), 4 deletions(-)

diff --git a/airflow/jobs/job.py b/airflow/jobs/job.py
index 0afbb2a026..0bd81e0c80 100644
--- a/airflow/jobs/job.py
+++ b/airflow/jobs/job.py
@@ -264,6 +264,8 @@ class Job(Base, LoggingMixin):
 def _heartrate(job_type: str) -> float:
 if job_type == "TriggererJob":
 return conf.getfloat("triggerer", "JOB_HEARTBEAT_SEC")
+elif job_type == "SchedulerJob":
+return conf.getfloat("scheduler", "SCHEDULER_HEARTBEAT_SEC")
 else:
 # Heartrate used to be hardcoded to scheduler, so in all other
 # cases continue to use that value for back compat
diff --git a/airflow/jobs/scheduler_job_runner.py 
b/airflow/jobs/scheduler_job_runner.py
index bf5a28cb5b..eddd742571 100644
--- a/airflow/jobs/scheduler_job_runner.py
+++ b/airflow/jobs/scheduler_job_runner.py
@@ -151,7 +151,6 @@ class SchedulerJobRunner(BaseJobRunner, LoggingMixin):
 """
 
 job_type = "SchedulerJob"
-heartrate: int = conf.getint("scheduler", "SCHEDULER_HEARTBEAT_SEC")
 
 def __init__(
 self,
diff --git a/tests/jobs/test_base_job.py b/tests/jobs/test_base_job.py
index 8f7237ffc6..62e0369791 100644
--- a/tests/jobs/test_base_job.py
+++ b/tests/jobs/test_base_job.py
@@ -97,9 +97,12 @@ class TestJob:
 )
 def test_heart_rate_after_fetched_from_db(self, job_runner, job_type, 
job_heartbeat_sec):
 """Ensure heartrate is set correctly after jobs are queried from the 
DB"""
-with create_session() as session, conf_vars(
-{(job_type.lower(), "job_heartbeat_sec"): job_heartbeat_sec}
-):
+if job_type == "scheduler":
+config_name = "scheduler_heartbeat_sec"
+else:
+config_name = "job_heartbeat_sec"
+
+with create_session() as session, conf_vars({(job_type.lower(), 
config_name): job_heartbeat_sec}):
 job = Job()
 job_runner(job=job)
 session.add(job)
diff --git a/tests/jobs/test_scheduler_job.py b/tests/jobs/test_scheduler_job.py
index 8e745e1e08..e5de64ee9d 100644
--- a/tests/jobs/test_scheduler_job.py
+++ b/tests/jobs/test_scheduler_job.py
@@ -193,6 +193,16 @@ class TestSchedulerJob:
 not scheduler_job.is_alive()
 ), "Completed jobs even with recent heartbeat should not be alive"
 
+@pytest.mark.parametrize(
+"heartrate",
+[10, 5],
+)
+def test_heartrate(self, heartrate):
+with conf_vars({("scheduler", "scheduler_heartbeat_sec"): 
str(heartrate)}):
+scheduler_job = Job(executor=self.null_exec)
+_ = SchedulerJobRunner(job=scheduler_job)
+assert scheduler_job.heartrate == heartrate
+
 def run_single_scheduler_loop_with_no_dags(self, dags_folder):
 """
 Utility function that runs a single scheduler loop without actually
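
The fixed dispatch in `Job._heartrate` can be sketched with config access stubbed by a plain dict; the values and the fallback key below are illustrative assumptions, not Airflow's real defaults:

```python
# a plain dict stands in for airflow.configuration.conf
CONF = {
    ("triggerer", "job_heartbeat_sec"): 5.0,
    ("scheduler", "scheduler_heartbeat_sec"): 10.0,
    ("scheduler", "job_heartbeat_sec"): 5.0,
}

def heartrate(job_type: str) -> float:
    if job_type == "TriggererJob":
        return CONF[("triggerer", "job_heartbeat_sec")]
    if job_type == "SchedulerJob":
        # the branch the fix adds: scheduler reads its own heartbeat setting
        return CONF[("scheduler", "scheduler_heartbeat_sec")]
    # all other job types keep the historical scheduler-section value
    return CONF[("scheduler", "job_heartbeat_sec")]
```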



(airflow) 01/07: Fix hash caching in `ObjectStoragePath` (#37769)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a commit to branch v2-8-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 07b32f04fe1dc5b568f954e68960ecba50a71f80
Author: Andrey Anshin 
AuthorDate: Wed Feb 28 17:05:58 2024 +0400

Fix hash caching in `ObjectStoragePath` (#37769)

(cherry picked from commit 382a9618ed15403766ec7d1bcbe41ed955b3e60d)
---
 airflow/io/path.py| 6 +++---
 tests/io/test_path.py | 9 +
 2 files changed, 12 insertions(+), 3 deletions(-)

diff --git a/airflow/io/path.py b/airflow/io/path.py
index bd7c320653..40a9e49e52 100644
--- a/airflow/io/path.py
+++ b/airflow/io/path.py
@@ -17,7 +17,6 @@
 from __future__ import annotations
 
 import contextlib
-import functools
 import os
 import shutil
 import typing
@@ -149,9 +148,10 @@ class ObjectStoragePath(CloudPath):
 
 return cls._from_parts(args_list, url=parsed_url, conn_id=conn_id, 
**kwargs)  # type: ignore
 
-@functools.lru_cache
 def __hash__(self) -> int:
-return hash(str(self))
+if not (_hash := getattr(self, "_hash", None)):
+self._hash = _hash = hash(str(self))
+return _hash
 
 def __eq__(self, other: typing.Any) -> bool:
 return self.samestore(other) and str(self) == str(other)
diff --git a/tests/io/test_path.py b/tests/io/test_path.py
index 7c97c7b2b9..7f250825ce 100644
--- a/tests/io/test_path.py
+++ b/tests/io/test_path.py
@@ -306,3 +306,12 @@ class TestFs:
 finally:
 # Reset the cache to avoid side effects
 _register_filesystems.cache_clear()
+
+def test_hash(self):
+file_uri_1 = f"file:///tmp/{str(uuid.uuid4())}"
+file_uri_2 = f"file:///tmp/{str(uuid.uuid4())}"
+s = set()
+for _ in range(10):
+s.add(ObjectStoragePath(file_uri_1))
+s.add(ObjectStoragePath(file_uri_2))
+assert len(s) == 2
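
The pattern in the diff - caching the computed hash on the instance instead of decorating `__hash__` with `functools.lru_cache` - can be reproduced on a toy class (illustrative only, not Airflow code):

```python
class CachedHashPath:
    """Toy stand-in demonstrating the manual __hash__ cache."""

    def __init__(self, uri: str) -> None:
        self._uri = uri

    def __str__(self) -> str:
        return self._uri

    def __hash__(self) -> int:
        # compute once and store on the instance; mirrors the patched code
        if not (_hash := getattr(self, "_hash", None)):
            self._hash = _hash = hash(str(self))
        return _hash

    def __eq__(self, other: object) -> bool:
        return isinstance(other, CachedHashPath) and str(self) == str(other)

s = set()
for _ in range(10):
    s.add(CachedHashPath("file:///tmp/a"))
    s.add(CachedHashPath("file:///tmp/b"))
# equal paths collapse to one set entry, as the new test asserts
```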



Re: [PR] Fix filtered TI links [airflow]

2024-03-18 Thread via GitHub


jedcunningham merged PR #38274:
URL: https://github.com/apache/airflow/pull/38274


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(airflow) branch main updated (a2c38ea336 -> 8ea95e778a)

2024-03-18 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from a2c38ea336 Fix missing reverse quote in docs (#38275)
 add 8ea95e778a Fix filtered TI links (#38274)

No new revisions were added by this update.

Summary of changes:
 airflow/www/utils.py | 1 +
 1 file changed, 1 insertion(+)



Re: [PR] Add a task instance dependency for mapped dependencies (#37091) [airflow]

2024-03-18 Thread via GitHub


stevenschaerer commented on code in PR #37498:
URL: https://github.com/apache/airflow/pull/37498#discussion_r1529352990


##
airflow/ti_deps/deps/mapped_task_upstream_dep.py:
##
@@ -0,0 +1,92 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from collections.abc import Iterator
+from typing import TYPE_CHECKING
+
+from airflow.ti_deps.deps.base_ti_dep import BaseTIDep
+from airflow.utils.state import State, TaskInstanceState
+
+if TYPE_CHECKING:
+from sqlalchemy.orm import Session
+
+from airflow.models.taskinstance import TaskInstance
+from airflow.ti_deps.dep_context import DepContext
+from airflow.ti_deps.deps.base_ti_dep import TIDepStatus
+
+
+class MappedTaskUpstreamDep(BaseTIDep):
+"""
+Determines if a mapped task's upstream tasks that provide XComs used by 
this task for task mapping are in
+a state that allows a given task instance to run.
+"""

Review Comment:
   I actually had to rework this a bit to satisfy ruff. Please have a look.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] Add a task instance dependency for mapped dependencies (#37091) [airflow]

2024-03-18 Thread via GitHub


stevenschaerer commented on code in PR #37498:
URL: https://github.com/apache/airflow/pull/37498#discussion_r1529335133


##
airflow/ti_deps/deps/mapped_task_upstream_dep.py:
##
@@ -0,0 +1,92 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from collections.abc import Iterator
+from typing import TYPE_CHECKING
+
+from airflow.ti_deps.deps.base_ti_dep import BaseTIDep
+from airflow.utils.state import State, TaskInstanceState
+
+if TYPE_CHECKING:
+from sqlalchemy.orm import Session
+
+from airflow.models.taskinstance import TaskInstance
+from airflow.ti_deps.dep_context import DepContext
+from airflow.ti_deps.deps.base_ti_dep import TIDepStatus
+
+
+class MappedTaskUpstreamDep(BaseTIDep):
+"""
+Determines if a mapped task's upstream tasks that provide XComs used by 
this task for task mapping are in
+a state that allows a given task instance to run.
+"""
+
+NAME = "Mapped dependencies have succeeded"
+IGNORABLE = True
+IS_TASK_DEP = True
+
+def _get_dep_statuses(
+self,
+ti: TaskInstance,
+session: Session,
+dep_context: DepContext,
+) -> Iterator[TIDepStatus]:
+from airflow.models.mappedoperator import MappedOperator
+
+if isinstance(ti.task, MappedOperator):
+mapped_dependencies = ti.task.iter_mapped_dependencies()
+elif (task_group := ti.task.get_closest_mapped_task_group()) is not 
None:

Review Comment:
   I was thinking about this, but realized that nested mapped task groups and 
mapped operators within mapped task groups are currently not supported. I now 
added a couple of unit tests that verify that this is not the case. This way, 
if they end up being supported in the future there will be failing tests 
indicating that some work needs to be done.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(airflow) branch main updated: Fix missing reverse quote in docs (#38275)

2024-03-18 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new a2c38ea336 Fix missing reverse quote in docs (#38275)
a2c38ea336 is described below

commit a2c38ea33632d930c1f2d5dcc68697a5d6da1e03
Author: Jarek Potiuk 
AuthorDate: Mon Mar 18 23:06:55 2024 +0100

Fix missing reverse quote in docs (#38275)

Introduced in #38262
---
 airflow/providers/common/io/CHANGELOG.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/providers/common/io/CHANGELOG.rst 
b/airflow/providers/common/io/CHANGELOG.rst
index cd867f7637..78ccaa195c 100644
--- a/airflow/providers/common/io/CHANGELOG.rst
+++ b/airflow/providers/common/io/CHANGELOG.rst
@@ -79,6 +79,6 @@ Bug Fixes
 1.0.0 (YANKED)
 ..
 
-.. warning:: This release has been **yanked** with a reason: ``Used older 
interface from 2.8.0.dev0 versions`
+.. warning:: This release has been **yanked** with a reason: ``Used older 
interface from 2.8.0.dev0 versions``
 
 Initial version of the provider.



Re: [PR] Fix missing reverse quote in docs [airflow]

2024-03-18 Thread via GitHub


Taragolis merged PR #38275:
URL: https://github.com/apache/airflow/pull/38275


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] Add a task instance dependency for mapped dependencies (#37091) [airflow]

2024-03-18 Thread via GitHub


stevenschaerer commented on code in PR #37498:
URL: https://github.com/apache/airflow/pull/37498#discussion_r1529344008


##
airflow/ti_deps/deps/mapped_task_upstream_dep.py:
##
@@ -0,0 +1,92 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from collections.abc import Iterator
+from typing import TYPE_CHECKING
+
+from airflow.ti_deps.deps.base_ti_dep import BaseTIDep
+from airflow.utils.state import State, TaskInstanceState
+
+if TYPE_CHECKING:
+from sqlalchemy.orm import Session
+
+from airflow.models.taskinstance import TaskInstance
+from airflow.ti_deps.dep_context import DepContext
+from airflow.ti_deps.deps.base_ti_dep import TIDepStatus
+
+
+class MappedTaskUpstreamDep(BaseTIDep):
+"""
+Determines if a mapped task's upstream tasks that provide XComs used by 
this task for task mapping are in
+a state that allows a given task instance to run.
+"""
+
+NAME = "Mapped dependencies have succeeded"
+IGNORABLE = True
+IS_TASK_DEP = True
+
+def _get_dep_statuses(
+self,
+ti: TaskInstance,
+session: Session,
+dep_context: DepContext,
+) -> Iterator[TIDepStatus]:
+from airflow.models.mappedoperator import MappedOperator
+
+if isinstance(ti.task, MappedOperator):
+mapped_dependencies = ti.task.iter_mapped_dependencies()
+elif (task_group := ti.task.get_closest_mapped_task_group()) is not 
None:
+mapped_dependencies = task_group.iter_mapped_dependencies()
+else:
+return
+
+mapped_dependency_tis = [
+ti.get_dagrun(session).get_task_instance(operator.task_id, 
session=session)
+for operator in mapped_dependencies
+]

Review Comment:
   Dependencies that are themselves mapped are tested in test_step_by_step - m2 
depends on m1.
   
   Once the dependency is expanded, `get_task_instance` returns None, which is 
filtered out below. My thinking was that if a task is expanded, it couldn't 
have failed or been skipped altogether, so for the purpose of this dependency
check this is fine. Am I wrong here?
   
   What exactly would the query you suggest look like?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] Fix missing reverse quote in docs [airflow]

2024-03-18 Thread via GitHub


potiuk opened a new pull request, #38275:
URL: https://github.com/apache/airflow/pull/38275

   Introduced in #38262
   
   
   
   
   
   
   
   
   ---
   **^ Add meaningful description above**
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





Re: [PR] Add CLI support for bulk pause and resume of DAGs [airflow]

2024-03-18 Thread via GitHub


shahar1 commented on code in PR #38265:
URL: https://github.com/apache/airflow/pull/38265#discussion_r1528751298


##
airflow/cli/commands/dag_command.py:
##
@@ -214,14 +214,11 @@ def dag_unpause(args) -> None:
 @providers_configuration_loaded
 def set_is_paused(is_paused: bool, args) -> None:
 """Set is_paused for DAG by a given dag_id."""
-dag = DagModel.get_dagmodel(args.dag_id)
-
-if not dag:
-raise SystemExit(f"DAG: {args.dag_id} does not exist in 'dag' table")

Review Comment:
   Not necessary now as it's validated as part of existing `get_dags`. 
   To be on the safe side - I added a test that validates that an error is 
raised if the DAG doesn't exist.
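
   A minimal sketch of the validation behavior described above: the lookup helper 
itself raising when nothing matches, so callers no longer need a separate 
existence check. `get_matching_dags` and its arguments are hypothetical, not the 
actual `get_dags` signature:

```python
# Hypothetical sketch: resolve dag_id (optionally as a regex) against known
# DAG ids and fail fast when nothing matches, mirroring the behavior that
# makes the removed explicit existence check redundant.
import re


def get_matching_dags(known_dag_ids, dag_id, use_regex=False):
    matches = [
        d for d in known_dag_ids
        if (re.search(dag_id, d) if use_regex else d == dag_id)
    ]
    if not matches:
        raise SystemExit(f"DAG: {dag_id} does not exist in 'dag' table")
    return matches


get_matching_dags(["dag", "dag-1"], "dag")                  # exact match only
get_matching_dags(["dag", "dag-1"], "dag", use_regex=True)  # regex match
```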






(airflow-site) branch gh-pages updated (66818118b4 -> ec356e0e69)

2024-03-18 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch gh-pages
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


 discard 66818118b4 Rewritten history to remove past gh-pages deployments
 new ec356e0e69 Rewritten history to remove past gh-pages deployments

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (66818118b4)
\
 N -- N -- N   refs/heads/gh-pages (ec356e0e69)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 404.html   |   3 +
 announcements/index.html   |   3 +
 blocktype/index.html   |   3 +
 blocktype/testimonial/index.html   |   3 +
 blocktype/use-case/index.html  |   3 +
 blog/airflow-1.10.10/index.html|   7 +-
 blog/airflow-1.10.12/index.html|   7 +-
 blog/airflow-1.10.8-1.10.9/index.html  |   7 +-
 blog/airflow-2.2.0/index.html  |   7 +-
 blog/airflow-2.3.0/index.html  |   7 +-
 blog/airflow-2.4.0/index.html  |   7 +-
 blog/airflow-2.5.0/index.html  |   7 +-
 blog/airflow-2.6.0/index.html  |   7 +-
 blog/airflow-2.7.0/index.html  |   7 +-
 blog/airflow-2.8.0/index.html  |   7 +-
 blog/airflow-survey-2020/index.html|   7 +-
 blog/airflow-survey-2022/index.html|   7 +-
 blog/airflow-survey/index.html |   7 +-
 blog/airflow-two-point-oh-is-here/index.html   |   7 +-
 blog/airflow_summit_2021/index.html|   7 +-
 blog/airflow_summit_2022/index.html|   7 +-
 blog/announcing-new-website/index.html |   7 +-
 blog/apache-airflow-for-newcomers/index.html   |   7 +-
 .../index.html |   7 +-
 .../index.html |   7 +-
 .../index.html |   7 +-
 .../index.html |   7 +-
 blog/fab-oid-vulnerability/index.html  |   7 +-
 .../index.html |   7 +-
 blog/index.html|   3 +
 blog/introducing_setup_teardown/index.html |   7 +-
 .../index.html |   7 +-
 blog/tags/airflow-summit/index.html|   3 +
 blog/tags/community/index.html |   3 +
 blog/tags/development/index.html   |   3 +
 blog/tags/documentation/index.html |   3 +
 blog/tags/release/index.html   |   3 +
 blog/tags/rest-api/index.html  |   3 +
 blog/tags/survey/index.html|   3 +
 blog/tags/users/index.html |   3 +
 blog/tags/vulnerabilities/index.html   |   3 +
 categories/index.html  |   3 +
 code-of-conduct/index.html |   3 +
 community/index.html   |   3 +
 docs/index.html|   3 +
 ecosystem/index.html   |   3 +
 index.html |   3 +
 meetups/index.html |   3 +
 privacy-notice/index.html  |   3 +
 roadmap/index.html |   3 +
 search/index.html  |   7 +-
 sitemap.xml| 134 ++---
 survey/index.html  |   3 +
 tags/index.html|   3 +
 use-cases/adobe/index.html |   7 +-
 use-cases/adyen/index.html |   7 +-
 use-cases/big-fish-games/index.html|   7 +-
 use-cases/business_operations/index.html   |   7 +-
 use-cases/dish/index.html  |   7 +-
 use-cases/etl_analytics/index.html |   7 +-
 use-cases/experity/index.html  |   7 

Re: [PR] Fix task instances list link [airflow]

2024-03-18 Thread via GitHub


jedcunningham commented on PR #38096:
URL: https://github.com/apache/airflow/pull/38096#issuecomment-2005015981

   Thanks @Pad71, opened #38274 to fix it.





[PR] Fix filtered TI links [airflow]

2024-03-18 Thread via GitHub


jedcunningham opened a new pull request, #38274:
URL: https://github.com/apache/airflow/pull/38274

   We have links to the graph, filtered down to a specific TI. This was 
accidentally broken in #38096.
   
   See https://github.com/apache/airflow/pull/38096#issuecomment-1996734180.





Re: [PR] Update yanked versions in providers changelogs [airflow]

2024-03-18 Thread via GitHub


Satoshi-Sh commented on PR #38262:
URL: https://github.com/apache/airflow/pull/38262#issuecomment-2004987631

   ![Screenshot from 2024-03-18 
15-57-24](https://github.com/apache/airflow/assets/73622805/46ea55b5-4317-43bf-90b1-3dff6fe48657)
   
   I was working on the D105 issue (#38271) and the docs CI check shows this error. 
   It seems one backtick is missing, which causes the error.
   
   @Taragolis 





Re: [PR] Add `frame-src` to CSP so YouTube videos are viewable [airflow-site]

2024-03-18 Thread via GitHub


potiuk merged PR #982:
URL: https://github.com/apache/airflow-site/pull/982





(airflow-site) branch main updated: Add csp for yt videos to head. (#982)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/main by this push:
 new 9fa9196b12 Add csp for yt videos to head. (#982)
9fa9196b12 is described below

commit 9fa9196b12737cb57c7f7503f5146bab5bebd363
Author: Michael Robinson <68482867+merobi-...@users.noreply.github.com>
AuthorDate: Mon Mar 18 16:55:09 2024 -0400

Add csp for yt videos to head. (#982)

Signed-off-by: merobi-hub 
---
 landing-pages/site/layouts/partials/hooks/head-end.html | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/landing-pages/site/layouts/partials/hooks/head-end.html 
b/landing-pages/site/layouts/partials/hooks/head-end.html
index c840423105..7dbf8418d5 100644
--- a/landing-pages/site/layouts/partials/hooks/head-end.html
+++ b/landing-pages/site/layouts/partials/hooks/head-end.html
@@ -53,4 +53,7 @@
 {{ $vendorsHeader := index . "vendors~header" }}
 
 
+
+https://www.youtube.com'">
+
 {{ end }}



[PR] Add `frame-src` to CSP so YouTube videos are viewable [airflow-site]

2024-03-18 Thread via GitHub


merobi-hub opened a new pull request, #982:
URL: https://github.com/apache/airflow-site/pull/982

   The site's CSP is blocking YouTube videos currently.
   
   This adds a `meta` tag to head-end.html to allow `frame-src` directives from 
YouTube.





Re: [PR] Migrate to connexion v3 [airflow]

2024-03-18 Thread via GitHub


potiuk commented on PR #37638:
URL: https://github.com/apache/airflow/pull/37638#issuecomment-2004907580

   As discussed on Slack, https://github.com/sudiptob2/airflow/pull/22 -> likely 
has the solution (assets not compiled in the CI image). 
   
   You will also need to rebase and resolve conflicts @sudiptob2 :) 





[PR] Create AWS auth manager documentation. Part: setup identity center [airflow]

2024-03-18 Thread via GitHub


vincbeck opened a new pull request, #38273:
URL: https://github.com/apache/airflow/pull/38273

   This PR adds documentation for the AWS auth manager. To make it easy to 
review PRs, I'll split the AWS auth manager documentation effort into multiple 
PRs. Each PR would add a section of the documentation. This PR focuses on how 
to setup AWS IAM Identity Center to work with AWS auth manager.
   
   
   
   
   
   
   
   





Re: [PR] fix(google,log): Avoid log name overriding [airflow]

2024-03-18 Thread via GitHub


AlexisBRENON commented on code in PR #38071:
URL: https://github.com/apache/airflow/pull/38071#discussion_r1529196456


##
airflow/providers/google/cloud/log/stackdriver_task_handler.py:
##
@@ -99,7 +99,7 @@ def __init__(
 super().__init__()
 self.gcp_key_path: str | None = gcp_key_path
 self.scopes: Collection[str] | None = scopes
-self.name: str = name
+self.gcp_log_name: str = name

Review Comment:
   I don't see any "good" use case where someone would want the current 
behavior instead of the fix.
   @shahar1 I don't change the constructor arguments (you can still get an 
instance using `StackdriverTaskHandler(name="foo", ...)`), but I do change the 
meaning of the instance attribute `name`: up to now it was supposed to be the 
name of the GCP log; after this fix it will be the name of the Python logging 
handler.
   I could add some logging on access (read or write) to the `name` attribute, 
but I think it would be hard to implement cleanly and too verbose, as setting 
`name` is genuine behavior of the `dictConfig` method.
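
   The clash being fixed above can be reproduced with plain `logging` alone; 
`DemoHandler` below is a hypothetical illustration, not the Stackdriver handler:

```python
# Minimal reproduction of the attribute clash discussed above: logging's
# dictConfig assigns each handler's config key to handler.name, so a handler
# subclass that also stores its own "name" attribute would get it clobbered.
import logging


class DemoHandler(logging.Handler):
    def __init__(self, name):
        super().__init__()
        # Renamed attribute (as in the fix) survives dictConfig; storing it
        # as `self.name = name` instead would be overwritten later.
        self.gcp_log_name = name


handler = DemoHandler("my-gcp-log")
handler.name = "handler_key"  # what logging.config effectively does
# handler.gcp_log_name is still "my-gcp-log"
```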






Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


jedcunningham commented on code in PR #38166:
URL: https://github.com/apache/airflow/pull/38166#discussion_r1529172125


##
airflow/www/decorators.py:
##
@@ -94,35 +94,54 @@ def wrapper(*args, **kwargs):
 user = get_auth_manager().get_user_name()
 user_display = get_auth_manager().get_user_display_name()
 
-fields_skip_logging = {"csrf_token", "_csrf_token", 
"is_paused"}
-extra_fields = [
-(k, secrets_masker.redact(v, k))
+isAPIRequest = request.blueprint == "/api/v1"
+hasJsonBody = request.headers.get("content-type") == 
"application/json" and request.json
+
+fields_skip_logging = {
+"csrf_token",
+"_csrf_token",
+"is_paused",
+"dag_id",
+"task_id",
+"dag_run_id",
+"run_id",
+"execution_date",
+}
+extra_fields = {
+k: secrets_masker.redact(v, k)
 for k, v in 
itertools.chain(request.values.items(multi=True), request.view_args.items())
 if k not in fields_skip_logging
-]
+}
 if event and event.startswith("variable."):
-extra_fields = _mask_variable_fields(extra_fields)
-if event and event.startswith("connection."):
-extra_fields = _mask_connection_fields(extra_fields)
+extra_fields = _mask_variable_fields(
+request.json if isAPIRequest and hasJsonBody else 
extra_fields
+)
+elif event and event.startswith("connection."):
+extra_fields = _mask_connection_fields(
+request.json if isAPIRequest and hasJsonBody else 
extra_fields
+)
+elif hasJsonBody:
+extra_fields["body"] = request.json

Review Comment:
   Can you add a test case for this adding the body?



##
airflow/www/decorators.py:
##
@@ -94,35 +94,54 @@ def wrapper(*args, **kwargs):
 user = get_auth_manager().get_user_name()
 user_display = get_auth_manager().get_user_display_name()
 
-fields_skip_logging = {"csrf_token", "_csrf_token", 
"is_paused"}
-extra_fields = [
-(k, secrets_masker.redact(v, k))
+isAPIRequest = request.blueprint == "/api/v1"
+hasJsonBody = request.headers.get("content-type") == 
"application/json" and request.json
+
+fields_skip_logging = {
+"csrf_token",
+"_csrf_token",
+"is_paused",
+"dag_id",

Review Comment:
   Can you add test coverage that these new fields are skipped?






Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on code in PR #38166:
URL: https://github.com/apache/airflow/pull/38166#discussion_r1529132760


##
airflow/www/decorators.py:
##
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
 Mask the 'val_content' field if 'key_content' is in the mask list.
 
 The variable requests values and args comes in this form:
-[('key', 'key_content'),('val', 'val_content'), ('description', 
'description_content')]
+[{'key': 'key_content'},{'val': 'val_content'}, {'description': 
'description_content'}]

Review Comment:
   Yes, good catch. I forgot to update the comment after I changed it from an 
array to a single dict






Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


jedcunningham commented on code in PR #38166:
URL: https://github.com/apache/airflow/pull/38166#discussion_r1529118757


##
airflow/www/decorators.py:
##
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
 Mask the 'val_content' field if 'key_content' is in the mask list.
 
 The variable requests values and args comes in this form:
-[('key', 'key_content'),('val', 'val_content'), ('description', 
'description_content')]
+[{'key': 'key_content'},{'val': 'val_content'}, {'description': 
'description_content'}]

Review Comment:
   ```suggestion
   {'key': 'key_content', 'val': 'val_content', 'description': 
'description_content'}
   ```
   
   Isn't this a single dict?






Re: [PR] Richer Audit Log extra field [airflow]

2024-03-18 Thread via GitHub


jedcunningham commented on code in PR #38166:
URL: https://github.com/apache/airflow/pull/38166#discussion_r1529105471


##
airflow/www/decorators.py:
##
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
 Mask the 'val_content' field if 'key_content' is in the mask list.
 
 The variable requests values and args comes in this form:
-[('key', 'key_content'),('val', 'val_content'), ('description', 
'description_content')]
+[{'key': 'key_content'},{'val': 'val_content'}, {'description': 
'description_content'}]
 """
-result = []
+result = {}
 keyname = None
-for k, v in extra_fields:
+for k, v in extra_fields.items():
 if k == "key":
 keyname = v
-result.append((k, v))
-elif keyname and k == "val":
+result.update({k: v})
+elif keyname and (k == "val" or k == "value"):
 x = secrets_masker.redact(v, keyname)
-result.append((k, x))
+result.update({k: x})

Review Comment:
   ```suggestion
   result[k] = x
   ```



##
airflow/www/decorators.py:
##
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
 Mask the 'val_content' field if 'key_content' is in the mask list.
 
 The variable requests values and args comes in this form:
-[('key', 'key_content'),('val', 'val_content'), ('description', 
'description_content')]
+[{'key': 'key_content'},{'val': 'val_content'}, {'description': 
'description_content'}]
 """
-result = []
+result = {}
 keyname = None
-for k, v in extra_fields:
+for k, v in extra_fields.items():
 if k == "key":
 keyname = v
-result.append((k, v))
-elif keyname and k == "val":
+result.update({k: v})

Review Comment:
   ```suggestion
   result[k] = v
   ```



##
airflow/www/decorators.py:
##
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
 Mask the 'val_content' field if 'key_content' is in the mask list.
 
 The variable requests values and args comes in this form:
-[('key', 'key_content'),('val', 'val_content'), ('description', 
'description_content')]
+[{'key': 'key_content'},{'val': 'val_content'}, {'description': 
'description_content'}]
 """
-result = []
+result = {}
 keyname = None
-for k, v in extra_fields:
+for k, v in extra_fields.items():
 if k == "key":
 keyname = v
-result.append((k, v))
-elif keyname and k == "val":
+result.update({k: v})
+elif keyname and (k == "val" or k == "value"):
 x = secrets_masker.redact(v, keyname)
-result.append((k, x))
+result.update({k: x})
 keyname = None
 else:
-result.append((k, v))
+result.update({k: v})
 return result
 
 
 def _mask_connection_fields(extra_fields):
 """Mask connection fields."""
-result = []
-for k, v in extra_fields:
-if k == "extra":
+result = {}
+for k, v in extra_fields.items():
+if k == "extra" and v:
 try:
 extra = json.loads(v)
-extra = [(k, secrets_masker.redact(v, k)) for k, v in 
extra.items()]
-result.append((k, json.dumps(dict(extra
+extra = {k: secrets_masker.redact(v, k) for k, v in 
extra.items()}
+result.update({k: dict(extra)})
 except json.JSONDecodeError:
-result.append((k, "Encountered non-JSON in `extra` field"))
+result.update({k: "Encountered non-JSON in `extra` field"})

Review Comment:
   ```suggestion
   result[k] = "Encountered non-JSON in `extra` field"
   ```



##
airflow/www/decorators.py:
##
@@ -44,36 +44,36 @@ def _mask_variable_fields(extra_fields):
 Mask the 'val_content' field if 'key_content' is in the mask list.
 
 The variable requests values and args comes in this form:
-[('key', 'key_content'),('val', 'val_content'), ('description', 
'description_content')]
+[{'key': 'key_content'},{'val': 'val_content'}, {'description': 
'description_content'}]
 """
-result = []
+result = {}
 keyname = None
-for k, v in extra_fields:
+for k, v in extra_fields.items():
 if k == "key":
 keyname = v
-result.append((k, v))
-elif keyname and k == "val":
+result.update({k: v})
+elif keyname and (k == "val" or k == "value"):
 x = secrets_masker.redact(v, keyname)
-result.append((k, x))
+result.update({k: x})
 keyname = None
 else:
-result.append((k, v))
+result.update({k: v})
 return result
 
 
 def _mask_connection_fields(extra_fields):
 """Mask connection fields."""
-result = []
-for k, v in extra_fields:
-
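
The field-masking pattern discussed in the review above can be sketched 
independently of Flask and Airflow's `secrets_masker`; the key set and 
`mask_fields` helper below are illustrative assumptions, not Airflow's actual 
masking rules:

```python
# Hypothetical sketch of the audit-log masking pattern discussed above:
# redact values for sensitive-looking keys before the fields are logged.
SENSITIVE_KEYS = {"val", "password", "extra"}  # illustrative, not Airflow's list


def mask_fields(extra_fields: dict) -> dict:
    """Return a copy of extra_fields with sensitive values replaced."""
    return {k: ("***" if k in SENSITIVE_KEYS else v) for k, v in extra_fields.items()}


masked = mask_fields({"key": "db_uri", "val": "postgres://secret", "description": "x"})
# masked -> {"key": "db_uri", "val": "***", "description": "x"}
```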

(airflow-site) branch gh-pages updated (8aec6b4974 -> 66818118b4)

2024-03-18 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch gh-pages
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


 discard 8aec6b4974 Rewritten history to remove past gh-pages deployments
 new 66818118b4 Rewritten history to remove past gh-pages deployments



Summary of changes:
 404.html   |  36 +-
 announcements/index.html   |  36 +-
 {tags => blocktype}/index.html |  48 +--
 blocktype/index.xml|  17 +
 {tags => blocktype/testimonial}/index.html |  50 +--
 {use-cases => blocktype/testimonial}/index.xml |   8 +-
 {tags => blocktype/use-case}/index.html|  50 +--
 blocktype/use-case/index.xml   | 226 
 blog/airflow-1.10.10/index.html|  40 +--
 blog/airflow-1.10.12/index.html|  40 +--
 blog/airflow-1.10.8-1.10.9/index.html  |  40 +--
 blog/airflow-2.2.0/index.html  |  40 +--
 blog/airflow-2.3.0/index.html  |  40 +--
 blog/airflow-2.4.0/index.html  |  40 +--
 blog/airflow-2.5.0/index.html  |  40 +--
 blog/airflow-2.6.0/index.html  |  40 +--
 blog/airflow-2.7.0/index.html  |  40 +--
 blog/airflow-2.8.0/index.html  |  40 +--
 blog/airflow-survey-2020/index.html|  40 +--
 blog/airflow-survey-2022/index.html|  40 +--
 blog/airflow-survey/index.html |  40 +--
 blog/airflow-two-point-oh-is-here/index.html   |  40 +--
 blog/airflow_summit_2021/index.html|  40 +--
 blog/airflow_summit_2022/index.html|  40 +--
 blog/announcing-new-website/index.html |  40 +--
 blog/apache-airflow-for-newcomers/index.html   |  40 +--
 .../index.html |  40 +--
 .../index.html |  40 +--
 .../index.html |  40 +--
 .../index.html |  40 +--
 blog/fab-oid-vulnerability/index.html  |  40 +--
 .../index.html |  40 +--
 blog/index.html|  36 +-
 blog/introducing_setup_teardown/index.html |  40 +--
 .../index.html |  40 +--
 blog/tags/airflow-summit/index.html|  36 +-
 blog/tags/community/index.html |  36 +-
 blog/tags/development/index.html   |  36 +-
 blog/tags/documentation/index.html |  36 +-
 blog/tags/release/index.html   |  36 +-
 blog/tags/rest-api/index.html  |  36 +-
 blog/tags/survey/index.html|  36 +-
 blog/tags/users/index.html |  36 +-
 blog/tags/vulnerabilities/index.html   |  36 +-
 categories/index.html  |  36 +-
 chunk-1.37486.js => chunk-1.22619.js   |   0
 chunk-3.37486.js => chunk-3.22619.js   |   0
 chunk-4.37486.js => chunk-4.22619.js   |   0
 chunk-5.37486.js => chunk-5.22619.js   |   0
 code-of-conduct/index.html |  36 +-
 community/index.html   |  36 +-
 docs.37486.js => docs.22619.js |   0
 docs/index.html|  36 +-
 ecosystem/index.html   |  36 +-
 index.html |  68 ++--
 main.37486.js => main.22619.js |   2 +-
 meetups/index.html |  36 +-
 privacy-notice/index.html  |  36 +-
 roadmap/index.html |  36 +-
 ...b77b8940a1438db9e2432286ff13f923435c9691655.css |   1 -
 

Re: [PR] Create DeleteKubernetesJobOperator and GKEDeleteJobOperator operators [airflow]

2024-03-18 Thread via GitHub


potiuk merged PR #37793:
URL: https://github.com/apache/airflow/pull/37793





(airflow) branch main updated (884852a7b8 -> 29ac05f496)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 884852a7b8 Add ssl context for verification of certs in FTPS hook 
(#38266)
 add 29ac05f496 Create DeleteKubernetesJobOperator and GKEDeleteJobOperator 
operators (#37793)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/cncf/kubernetes/operators/job.py |  71 +
 .../google/cloud/operators/kubernetes_engine.py| 109 +++-
 .../operators.rst  |  15 +++
 .../operators/cloud/kubernetes_engine.rst  |  21 
 .../cncf/kubernetes/operators/test_job.py  |  28 -
 .../cloud/operators/test_kubernetes_engine.py  | 114 -
 .../cncf/kubernetes/example_kubernetes_job.py  |  19 +++-
 .../example_kubernetes_engine_job.py   |  21 +++-
 8 files changed, 389 insertions(+), 9 deletions(-)



[PR] Resolve PT012 in `apache.spark`, `fab`, `ftp`, `openai` and `paermill` providers tests [airflow]

2024-03-18 Thread via GitHub


Taragolis opened a new pull request, #38272:
URL: https://github.com/apache/airflow/pull/38272

   
   
   
   
   
   
   





(airflow) branch main updated (63b58ff686 -> 884852a7b8)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 63b58ff686 Add executors property to base job (#38093)
 add 884852a7b8 Add ssl context for verification of certs in FTPS hook 
(#38266)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/ftp/hooks/ftp.py | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)
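
A hedged sketch of what "add ssl context for verification of certs" typically 
amounts to for `ftplib.FTP_TLS`: pass a default-verifying `SSLContext`. This 
illustrates the idea, not the provider's exact change:

```python
# Sketch: ssl.create_default_context() verifies server certificates and
# hostnames by default, whereas an FTP_TLS client without a context does
# not get that verification behavior for free.
import ftplib
import ssl


def make_verifying_ftps_client() -> ftplib.FTP_TLS:
    context = ssl.create_default_context()  # CERT_REQUIRED + hostname checks
    return ftplib.FTP_TLS(context=context)  # call .connect()/.login() later


client = make_verifying_ftps_client()
# client.context.verify_mode is ssl.CERT_REQUIRED
```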



Re: [PR] Add ssl context for verification of certs in FTPS hook [airflow]

2024-03-18 Thread via GitHub


potiuk merged PR #38266:
URL: https://github.com/apache/airflow/pull/38266





(airflow) branch main updated (5926b3bb81 -> 63b58ff686)

2024-03-18 Thread onikolas
This is an automated email from the ASF dual-hosted git repository.

onikolas pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 5926b3bb81 Add missing changelog record about MySQL support in ARM 
image (#38267)
 add 63b58ff686 Add executors property to base job (#38093)

No new revisions were added by this update.

Summary of changes:
 airflow/jobs/job.py |  4 
 tests/executors/test_executor_loader.py | 13 +
 tests/jobs/test_base_job.py |  6 --
 3 files changed, 21 insertions(+), 2 deletions(-)



Re: [PR] Add executors property to base job [airflow]

2024-03-18 Thread via GitHub


o-nikolas merged PR #38093:
URL: https://github.com/apache/airflow/pull/38093


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] D105 Check for Metrics Module [airflow]

2024-03-18 Thread via GitHub


Satoshi-Sh opened a new pull request, #38271:
URL: https://github.com/apache/airflow/pull/38271

   ## Related Issue 
   #37523 
   ## Checks 
   - [x] airflow/metrics/protocols.py
   - [x] airflow/metrics/validators.py
   
   
   
   
   
   
   
   ---
   **^ Add meaningful description above**
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





Re: [PR] Add executors property to base job [airflow]

2024-03-18 Thread via GitHub


o-nikolas commented on code in PR #38093:
URL: https://github.com/apache/airflow/pull/38093#discussion_r1529063606


##
airflow/jobs/job.py:
##
@@ -123,6 +123,10 @@ def __init__(self, executor=None, heartrate=None, 
**kwargs):
 def executor(self):
 return ExecutorLoader.get_default_executor()
 
+@cached_property

Review Comment:
   Marking this one as resolved, haven't heard back since the justification I 
put out. 






(airflow-site) branch main updated: Update use cases page (#977)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/main by this push:
 new ba4e2e3698 Update use cases page (#977)
ba4e2e3698 is described below

commit ba4e2e369845e316412aa170ceaa84c0993b9b93
Author: Kenten Danas <37819777+kentda...@users.noreply.github.com>
AuthorDate: Mon Mar 18 11:35:12 2024 -0700

Update use cases page (#977)



-

Signed-off-by: merobi-hub 
Co-authored-by: TJaniF 
Co-authored-by: Tamara Janina Fingerlin 
<90063506+tja...@users.noreply.github.com>
Co-authored-by: merobi-hub 
---
 CONTRIBUTE.md  |  32 ++-
 .../archetypes/{use-cases.md => testimonials.md}   |   1 +
 landing-pages/site/assets/scss/_base-layout.scss   |   5 ++
 landing-pages/site/assets/scss/_case-study.scss|   6 ++
 landing-pages/site/assets/scss/_list-boxes.scss|  59 ++-
 landing-pages/site/assets/scss/_quote.scss |   5 +-
 .../scss/{_case-study.scss => _testimonial.scss}   |  22 ---
 .../scss/{_quote.scss => _usecasedescription.scss} |  11 +---
 landing-pages/site/assets/scss/main-custom.scss|   2 +
 landing-pages/site/config.toml |  11 
 .../site/content/en/use-cases/_index.html  |   4 +-
 landing-pages/site/content/en/use-cases/adobe.md   |   1 +
 landing-pages/site/content/en/use-cases/adyen.md   |   1 +
 .../site/content/en/use-cases/big-fish-games.md|   1 +
 .../content/en/use-cases/business_operations.md|  63 +
 landing-pages/site/content/en/use-cases/dish.md|   1 +
 .../site/content/en/use-cases/etl_analytics.md |  60 
 .../site/content/en/use-cases/experity.md  |   1 +
 .../en/use-cases/infrastructure-management.md  |  56 ++
 landing-pages/site/content/en/use-cases/mlops.md   |  57 +++
 .../site/content/en/use-cases/onefootball.md   |   1 +
 .../site/content/en/use-cases/plarium-krasnodar.md |   1 +
 .../site/content/en/use-cases/seniorlink.md|   1 +
 landing-pages/site/content/en/use-cases/sift.md|   1 +
 landing-pages/site/content/en/use-cases/snapp.md   |   1 +
 landing-pages/site/content/en/use-cases/suse.md|   1 +
 landing-pages/site/layouts/index.html  |   2 +-
 .../site/layouts/partials/boxes/testimonial.html   |  32 +++
 .../site/layouts/partials/boxes/use-cases.html |  32 +++
 landing-pages/site/layouts/partials/navbar.html|   7 ---
 .../case-study.html => usecasedescription.html}|  17 ++
 landing-pages/site/layouts/use-cases/content.html  |   6 +-
 landing-pages/site/layouts/use-cases/list.html |  37 ++--
 landing-pages/site/layouts/use-cases/single.html   |   8 +--
 .../site/static/usecase-logos/business-ops.png | Bin 0 -> 16502 bytes
 landing-pages/site/static/usecase-logos/etl.png| Bin 0 -> 6822 bytes
 .../usecase-logos/infrastructure-management.png| Bin 0 -> 19506 bytes
 landing-pages/site/static/usecase-logos/mlops.png  | Bin 0 -> 8960 bytes
 .../placeholder_business_ops_video.png | Bin 0 -> 427030 bytes
 .../placeholder_etl_video.png  | Bin 0 -> 402127 bytes
 .../placeholder_infra_video.png| Bin 0 -> 376016 bytes
 .../placeholder_mlops_video.png| Bin 0 -> 416527 bytes
 landing-pages/src/index.js |   6 +-
 43 files changed, 482 insertions(+), 70 deletions(-)

diff --git a/CONTRIBUTE.md b/CONTRIBUTE.md
index f895d08581..3b6c2a1f48 100644
--- a/CONTRIBUTE.md
+++ b/CONTRIBUTE.md
@@ -243,14 +243,14 @@ the markdown file using this directive:
 ![Alt text](image.png)
 ```
 
-### How to add a new case study
+### How to add a new company testimonial
 
-To add a new case study with pre-filled frontmatter, in `/landing-pages/site` run:
+To add a new company testimonial with pre-filled frontmatter, in `/landing-pages/site` run:
 ```bash
-hugo new use-cases/my-use-case.md
+hugo new testimonials/my-testimonial.md
 ```
 
-That will create a markdown file `/landing-pages/site/content//use-cases/my-use-case.md`
+That will create a markdown file `/landing-pages/site/content//testimonials/my-testimonial.md`
 with following content:
 ```
 ---
@@ -261,6 +261,7 @@ quote:
 author: "Quote's author"
 logo: "logo-name-in-static-icons-directory.svg"
 draft: true
+blocktype: testimonial
 ---
 
 # What was the problem?
@@ -272,26 +273,27 @@ text
 # What are the results?
 text
 ```
-When you finish your writing blogpost, remember to **remove `draft: true`** 
from frontmatter.
+When you finish your testimonial, remember to **remove `draft: true`** from 
frontmatter.
 
 ---
 
-To add a new case study manually, create a markdown file in `/landing-pages/site/content//use-cases/.md`.
-The filename will 

Re: [PR] Update use cases page [airflow-site]

2024-03-18 Thread via GitHub


potiuk merged PR #977:
URL: https://github.com/apache/airflow-site/pull/977





Re: [PR] Update use cases page [airflow-site]

2024-03-18 Thread via GitHub


potiuk commented on PR #977:
URL: https://github.com/apache/airflow-site/pull/977#issuecomment-2004658656

   Looks good now.





Re: [PR] Add missing changelog record about MySQL support in ARM image [airflow]

2024-03-18 Thread via GitHub


potiuk merged PR #38267:
URL: https://github.com/apache/airflow/pull/38267





(airflow) branch main updated: Add missing changelog record about MySQL support in ARM image (#38267)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 5926b3bb81 Add missing changelog record about MySQL support in ARM 
image (#38267)
5926b3bb81 is described below

commit 5926b3bb81a9a7e8170244ae8e5d7d0a230758c1
Author: Andrey Anshin 
AuthorDate: Mon Mar 18 22:33:28 2024 +0400

Add missing changelog record about MySQL support in ARM image (#38267)
---
 docs/docker-stack/changelog.rst | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/docker-stack/changelog.rst b/docs/docker-stack/changelog.rst
index a256b6dba0..b026b65b9e 100644
--- a/docs/docker-stack/changelog.rst
+++ b/docs/docker-stack/changelog.rst
@@ -129,6 +129,7 @@ Airflow 2.6
 
   * Snowflake provider installed by default
 
+  * The ARM experimental image adds support for MySQL via MariaDB client 
libraries.
 
 Airflow 2.5.1
 ~



Re: [PR] Add missing changelog record about MySQL support in ARM image [airflow]

2024-03-18 Thread via GitHub


potiuk commented on PR #38267:
URL: https://github.com/apache/airflow/pull/38267#issuecomment-2004655658

   > So my assumption that officially it was added [into the 
2.6.0](https://github.com/apache/airflow/blob/2.6.0/Dockerfile)
   
   Yes. And since the ARM image was still very experimental in 2.5.3, it could 
be that we used the 2.6.* Dockerfile 





[I] Fix remaining `PT012` pytest checks: `pytest-raises-with-multiple-statements` [airflow]

2024-03-18 Thread via GitHub


Taragolis opened a new issue, #38270:
URL: https://github.com/apache/airflow/issues/38270

   ### Body
   
   Follow-up task for https://github.com/apache/airflow/pull/38219
   
   We enable this rule to avoid accidentally missing asserts inside 
`pytest.raises()` blocks
   
   ### Easy way to find all linting problems in a module
   1. Remove the line for the particular module from the 
`[tool.ruff.lint.per-file-ignores]` section of 
[`pyproject.toml`](https://github.com/apache/airflow/blob/main/pyproject.toml#L1490)
   2. Run ruff via 
[pre-commit](https://github.com/apache/airflow/blob/main/contributing-docs/08_static_code_checks.rst#pre-commit-hooks)
 hook 
   3. [Run tests 
locally](https://github.com/apache/airflow/blob/main/contributing-docs/testing/unit_tests.rst#running-unit-tests)
   
   ```console
   ❯ pre-commit run ruff --all-files
   
   Run 'ruff' for extremely fast Python 
linting.Failed
   - hook id: ruff
   - exit code: 1
   
   tests/providers/amazon/aws/hooks/test_base_aws.py:1126:5: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/hooks/test_datasync.py:410:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/hooks/test_eks.py:792:13: PT012 `pytest.raises()` 
block should contain a single simple statement
   tests/providers/amazon/aws/hooks/test_redshift_data.py:80:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/hooks/test_s3.py:495:9: PT012 `pytest.raises()` 
block should contain a single simple statement
   tests/providers/amazon/aws/operators/test_emr_serverless.py:435:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/operators/test_redshift_data.py:300:9: PT012 
`pytest.raises()` block should contain a single simple statement
   Found 7 errors.
   tests/providers/amazon/aws/sensors/test_glacier.py:91:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/sensors/test_glue.py:129:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/sensors/test_lambda_function.py:100:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/system/utils/test_helpers.py:88:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/system/utils/test_helpers.py:125:13: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/transfers/test_redshift_to_s3.py:377:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/triggers/test_ecs.py:55:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/triggers/test_ecs.py:74:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/waiters/test_neptune.py:56:9: PT012 
`pytest.raises()` block should contain a single simple statement
   tests/providers/amazon/aws/waiters/test_neptune.py:77:9: PT012 
`pytest.raises()` block should contain a single simple statement
   Found 10 errors.
   ```
   
   > [!TIP]
   > Feel free to ask for suggestions or help in the Slack channels 
`#contributors` or `#new-contributors`
   
   > [!NOTE]
   > It is fine to mark a specific case that cannot easily be resolved by 
adding `# noqa: PT012  reason why it should be skipped from the check`
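
   As an illustration, PT012 flags `pytest.raises()` blocks that contain more 
than one simple statement, because an exception raised by an unrelated line 
would also satisfy the check. A minimal sketch (the `parse_positive` helper is 
hypothetical; assumes `pytest` is installed):

   ```python
   import pytest


   def parse_positive(value: str) -> int:
       n = int(value)
       if n <= 0:
           raise ValueError("must be positive")
       return n


   # Flagged by PT012: setup and asserts live inside the raises block, so a
   # ValueError raised by any of these lines would make the test pass.
   def test_parse_positive_flagged():
       with pytest.raises(ValueError):
           raw = "-3"
           n = parse_positive(raw)
           assert n > 0  # never reached if the call raises


   # PT012-clean: only the single statement expected to raise is inside.
   def test_parse_positive_clean():
       raw = "-3"
       with pytest.raises(ValueError, match="positive"):
           parse_positive(raw)
   ```

   Both tests pass, but only the second guarantees that the exception came 
from the call under test.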
   
   
   # Regular Unit Tests
   
   ## Provider 
[amazon](https://github.com/apache/airflow/tree/main/airflow/providers/amazon)
   - [ ] `tests/providers/amazon/aws/hooks/test_base_aws.py`
   - [ ] `tests/providers/amazon/aws/hooks/test_datasync.py`
   - [ ] `tests/providers/amazon/aws/hooks/test_eks.py`
   - [ ] `tests/providers/amazon/aws/hooks/test_redshift_data.py`
   - [ ] `tests/providers/amazon/aws/hooks/test_s3.py`
   - [ ] `tests/providers/amazon/aws/operators/test_emr_serverless.py`
   - [ ] `tests/providers/amazon/aws/operators/test_redshift_data.py`
   - [ ] `tests/providers/amazon/aws/sensors/test_glacier.py`
   - [ ] `tests/providers/amazon/aws/sensors/test_glue.py`
   - [ ] `tests/providers/amazon/aws/sensors/test_lambda_function.py`
   - [ ] `tests/providers/amazon/aws/system/utils/test_helpers.py`
   - [ ] `tests/providers/amazon/aws/transfers/test_redshift_to_s3.py`
   - [ ] `tests/providers/amazon/aws/triggers/test_ecs.py`
   - [ ] `tests/providers/amazon/aws/waiters/test_neptune.py`
   
   ## Provider 
[apache.beam](https://github.com/apache/airflow/tree/main/airflow/providers/apache/beam)
   - [ ] `tests/providers/apache/beam/hooks/test_beam.py`
   
   ## Provider 
[apache.hive](https://github.com/apache/airflow/tree/main/airflow/providers/apache/hive)
   - [ ] `tests/providers/apache/hive/hooks/test_hive.py`
   - [ ] 

(airflow) branch main updated: `ECSExecutor` API Retry bug fix (#38118)

2024-03-18 Thread onikolas
This is an automated email from the ASF dual-hosted git repository.

onikolas pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 111245af3f `ECSExecutor` API Retry bug fix (#38118)
111245af3f is described below

commit 111245af3fe0bb1292ea83df2dfe0e268e6b848a
Author: Syed Hussain <103602455+syeda...@users.noreply.github.com>
AuthorDate: Mon Mar 18 11:29:29 2024 -0700

`ECSExecutor` API Retry bug fix (#38118)

* Fix bug in ECS Executor regarding API retries. Add unit test
---
 .../amazon/aws/executors/ecs/ecs_executor.py   |  22 ++-
 .../amazon/aws/executors/ecs/test_ecs_executor.py  | 196 ++---
 2 files changed, 187 insertions(+), 31 deletions(-)

diff --git a/airflow/providers/amazon/aws/executors/ecs/ecs_executor.py 
b/airflow/providers/amazon/aws/executors/ecs/ecs_executor.py
index 6167ddcbf4..87b9abe435 100644
--- a/airflow/providers/amazon/aws/executors/ecs/ecs_executor.py
+++ b/airflow/providers/amazon/aws/executors/ecs/ecs_executor.py
@@ -243,20 +243,17 @@ class AwsEcsExecutor(BaseExecutor):
 task_key = self.active_workers.arn_to_key[task.task_arn]
 
 # Mark finished tasks as either a success/failure.
-if task_state == State.FAILED:
-self.fail(task_key)
+if task_state == State.FAILED or task_state == State.REMOVED:
 self.__log_container_failures(task_arn=task.task_arn)
-elif task_state == State.SUCCESS:
-self.success(task_key)
-elif task_state == State.REMOVED:
 self.__handle_failed_task(task.task_arn, task.stopped_reason)
-if task_state in (State.FAILED, State.SUCCESS):
+elif task_state == State.SUCCESS:
 self.log.debug(
 "Airflow task %s marked as %s after running on ECS Task (arn) 
%s",
 task_key,
 task_state,
 task.task_arn,
 )
+self.success(task_key)
 self.active_workers.pop_by_key(task_key)
 
 def __describe_tasks(self, task_arns):
@@ -289,7 +286,14 @@ class AwsEcsExecutor(BaseExecutor):
 )
 
 def __handle_failed_task(self, task_arn: str, reason: str):
-"""If an API failure occurs, the task is rescheduled."""
+"""
+If an API failure occurs, the task is rescheduled.
+
+This function will determine whether the task has been attempted the 
appropriate number
+of times, and determine whether the task should be marked failed or 
not. The task will
+be removed from active_workers, and marked as FAILED, or set into 
pending_tasks depending on
+how many times it has been retried.
+"""
 task_key = self.active_workers.arn_to_key[task_arn]
 task_info = self.active_workers.info_by_key(task_key)
 task_cmd = task_info.cmd
@@ -305,7 +309,6 @@ class AwsEcsExecutor(BaseExecutor):
 self.__class__.MAX_RUN_TASK_ATTEMPTS,
 task_arn,
 )
-self.active_workers.increment_failure_count(task_key)
 self.pending_tasks.append(
 EcsQueuedTask(
 task_key,
@@ -322,8 +325,8 @@ class AwsEcsExecutor(BaseExecutor):
 task_key,
 failure_count,
 )
-self.active_workers.pop_by_key(task_key)
 self.fail(task_key)
+self.active_workers.pop_by_key(task_key)
 
 def attempt_task_runs(self):
 """
@@ -346,6 +349,7 @@ class AwsEcsExecutor(BaseExecutor):
 attempt_number = ecs_task.attempt_number
 _failure_reasons = []
 if timezone.utcnow() < ecs_task.next_attempt_time:
+self.pending_tasks.append(ecs_task)
 continue
 try:
 run_task_response = self._run_task(task_key, cmd, queue, 
exec_config)
diff --git a/tests/providers/amazon/aws/executors/ecs/test_ecs_executor.py 
b/tests/providers/amazon/aws/executors/ecs/test_ecs_executor.py
index c4c7c6c9c7..8762480821 100644
--- a/tests/providers/amazon/aws/executors/ecs/test_ecs_executor.py
+++ b/tests/providers/amazon/aws/executors/ecs/test_ecs_executor.py
@@ -598,6 +598,109 @@ class TestAwsEcsExecutor:
 == caplog.messages[0]
 )
 
+@mock.patch.object(ecs_executor, "calculate_next_attempt_delay", 
return_value=dt.timedelta(seconds=0))
+def test_task_retry_on_api_failure_all_tasks_fail(self, _, mock_executor, 
caplog):
+"""
+Test API failure retries.
+"""
+AwsEcsExecutor.MAX_RUN_TASK_ATTEMPTS = "2"
+airflow_keys = ["TaskInstanceKey1", "TaskInstanceKey2"]
+airflow_commands = [mock.Mock(spec=list), mock.Mock(spec=list)]
+
+mock_executor.execute_async(airflow_keys[0], airflow_commands[0])
+mock_executor.execute_async(airflow_keys[1], airflow_commands[1])
+
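
The retry accounting described in that docstring can be sketched in isolation 
roughly like this (a heavily simplified, hypothetical model of the executor's 
bookkeeping, not the real class):

```python
MAX_RUN_TASK_ATTEMPTS = 2  # mirrors the executor's configurable attempt cap


class QueuedTask:
    """Hypothetical stand-in for the executor's per-task record."""

    def __init__(self, key):
        self.key = key
        self.attempt_number = 1


pending_tasks = []  # tasks waiting to be retried
failed_keys = []    # task keys the executor has given up on


def handle_failed_task(task):
    # Re-queue the task while attempts remain; otherwise mark it failed,
    # analogous to __handle_failed_task once the attempt cap is reached.
    if task.attempt_number < MAX_RUN_TASK_ATTEMPTS:
        task.attempt_number += 1
        pending_tasks.append(task)
    else:
        failed_keys.append(task.key)


task = QueuedTask("ti-key")
handle_failed_task(task)                 # first API failure: re-queued
handle_failed_task(pending_tasks.pop())  # second failure: marked failed
assert failed_keys == ["ti-key"]
```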

Re: [PR] `ECSExecutor` API Retry bug fix [airflow]

2024-03-18 Thread via GitHub


o-nikolas merged PR #38118:
URL: https://github.com/apache/airflow/pull/38118





(airflow) branch main updated (2c1d0f8c41 -> af689b3e8b)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 2c1d0f8c41 Fix excessive permission changing for log task handler 
(#38164)
 add af689b3e8b Fix docker changelogs discrepancy between 2.8.3 and 2.9.0 
(#38269)

No new revisions were added by this update.

Summary of changes:
 docs/docker-stack/changelog.rst | 12 ++--
 1 file changed, 2 insertions(+), 10 deletions(-)



Re: [PR] Fix docker changelogs discrepancy between 2.8.3 and 2.9.0 [airflow]

2024-03-18 Thread via GitHub


potiuk merged PR #38269:
URL: https://github.com/apache/airflow/pull/38269





(airflow) branch main updated: Fix excessive permission changing for log task handler (#38164)

2024-03-18 Thread potiuk
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 2c1d0f8c41 Fix excessive permission changing for log task handler 
(#38164)
2c1d0f8c41 is described below

commit 2c1d0f8c4121589e92e4c0ca1665b89a2691d2d8
Author: Jarek Potiuk 
AuthorDate: Mon Mar 18 19:28:37 2024 +0100

Fix excessive permission changing for log task handler (#38164)

When log dir permission was changed by log handler, we've
implemented also changing permissions of parent folders recursively,
however it was quite a bit too much to change it for home directory
where the log folder could have been created - because in some cases
changing permissions might lead to unexpected side-effects - such
as losing the ability to log in to an ssh server.

Fixes: #38137
---
 airflow/utils/log/file_task_handler.py | 41 --
 tests/utils/test_log_handlers.py   | 28 ++-
 2 files changed, 21 insertions(+), 48 deletions(-)

diff --git a/airflow/utils/log/file_task_handler.py 
b/airflow/utils/log/file_task_handler.py
index 7596805bb9..f1fc64dd96 100644
--- a/airflow/utils/log/file_task_handler.py
+++ b/airflow/utils/log/file_task_handler.py
@@ -160,31 +160,6 @@ def _ensure_ti(ti: TaskInstanceKey | TaskInstance, 
session) -> TaskInstance:
 raise AirflowException(f"Could not find TaskInstance for {ti}")
 
 
-def _change_directory_permissions_up(directory: Path, folder_permissions: int):
-"""
-Change permissions of the given directory and its parents.
-
-Only attempt to change permissions for directories owned by the current 
user.
-
-:param directory: directory to change permissions of (including parents)
-:param folder_permissions: permissions to set
-"""
-if directory.stat().st_uid == os.getuid():
-if directory.stat().st_mode % 0o1000 != folder_permissions % 0o1000:
-print(f"Changing {directory} permission to {folder_permissions}")
-try:
-directory.chmod(folder_permissions)
-except PermissionError as e:
-# In some circumstances (depends on user and filesystem) we 
might not be able to
-# change the permission for the folder (when the folder was 
created by another user
-# before or when the filesystem does not allow to change 
permission). We should not
-# fail in this case but rather ignore it.
-print(f"Failed to change {directory} permission to 
{folder_permissions}: {e}")
-return
-if directory.parent != directory:
-_change_directory_permissions_up(directory.parent, 
folder_permissions)
-
-
 class FileTaskHandler(logging.Handler):
 """
 FileTaskHandler is a python log handler that handles and reads task 
instance logs.
@@ -482,7 +457,8 @@ class FileTaskHandler(logging.Handler):
 
 return logs, metadata_array
 
-def _prepare_log_folder(self, directory: Path):
+@staticmethod
+def _prepare_log_folder(directory: Path, new_folder_permissions: int):
 """
 Prepare the log folder and ensure its mode is as configured.
 
@@ -506,11 +482,9 @@ class FileTaskHandler(logging.Handler):
 sure that the same group is set as default group for both - 
impersonated user and main airflow
 user.
 """
-new_folder_permissions = int(
-conf.get("logging", "file_task_handler_new_folder_permissions", 
fallback="0o775"), 8
-)
-directory.mkdir(mode=new_folder_permissions, parents=True, 
exist_ok=True)
-_change_directory_permissions_up(directory, new_folder_permissions)
+for parent in reversed(directory.parents):
+parent.mkdir(mode=new_folder_permissions, exist_ok=True)
+directory.mkdir(mode=new_folder_permissions, exist_ok=True)
 
 def _init_file(self, ti, *, identifier: str | None = None):
 """
@@ -532,7 +506,10 @@ class FileTaskHandler(logging.Handler):
 # if this is true, we're invoked via set_context in the context of
 # setting up individual trigger logging. return trigger log path.
 full_path = self.add_triggerer_suffix(full_path=full_path, 
job_id=ti.triggerer_job.id)
-self._prepare_log_folder(Path(full_path).parent)
+new_folder_permissions = int(
+conf.get("logging", "file_task_handler_new_folder_permissions", 
fallback="0o775"), 8
+)
+self._prepare_log_folder(Path(full_path).parent, 
new_folder_permissions)
 
 if not os.path.exists(full_path):
 open(full_path, "a").close()
diff --git a/tests/utils/test_log_handlers.py b/tests/utils/test_log_handlers.py
index d6eea8a83e..166ee5b31f 100644
--- a/tests/utils/test_log_handlers.py
+++ 
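
The core of the fix is visible in the new `_prepare_log_folder`: instead of 
chmod-ing existing parents recursively (which could reach the home directory), 
it only creates the missing ancestors with the configured mode. A standalone 
sketch of that idea (function and path names are illustrative; the effective 
mode is still subject to the process umask):

```python
import tempfile
from pathlib import Path


def prepare_log_folder(directory: Path, new_folder_permissions: int = 0o775) -> None:
    # Create each missing ancestor (and the directory itself) with the
    # configured mode; directories that already exist are left untouched.
    for parent in reversed(directory.parents):
        parent.mkdir(mode=new_folder_permissions, exist_ok=True)
    directory.mkdir(mode=new_folder_permissions, exist_ok=True)


base = Path(tempfile.mkdtemp())
target = base / "dag_id=demo" / "run_id=manual" / "task_id=t1"
prepare_log_folder(target)
assert target.is_dir()
```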

Re: [I] Airflow changes permissions of home directory [airflow]

2024-03-18 Thread via GitHub


potiuk closed issue #38137: Airflow changes permissions of home directory
URL: https://github.com/apache/airflow/issues/38137





Re: [PR] Fix excessive permission changing for log task handler [airflow]

2024-03-18 Thread via GitHub


potiuk merged PR #38164:
URL: https://github.com/apache/airflow/pull/38164





Re: [PR] Fix docker changelogs discrepancy between 2.8.3 and 2.9.0 [airflow]

2024-03-18 Thread via GitHub


potiuk commented on PR #38269:
URL: https://github.com/apache/airflow/pull/38269#issuecomment-2004643768

   Yes. They moved. Thanks for catching it 





Re: [PR] Add average duration markline in task and dagrun duration charts. [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on PR #38214:
URL: https://github.com/apache/airflow/pull/38214#issuecomment-2004610358

   But if we do just average, this looks good. Does this also work for Run 
Duration when "Show Landing Times" is checked?





Re: [PR] Add average duration markline in task and dagrun duration charts. [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on PR #38214:
URL: https://github.com/apache/airflow/pull/38214#issuecomment-2004607556

   Nice work!
   
   If we do median and average, let's change the tick style so we rely on more 
than just color. Also, we should make sure the colors are separate from any 
state colors like red and green.
   
   I also agree that we need a legend.
   
   Later on, we can move more of this logic to the backend and we can do more 
complex metrics. I would still like to do a historic gantt chart with a 
box-and-whisker plot of task durations.





(airflow) branch main updated: Resolve PT008: Use `return_value=` instead of patching with `lambda` (#38244)

2024-03-18 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new 5a612dac4a Resolve PT008: Use `return_value=` instead of patching with 
`lambda` (#38244)
5a612dac4a is described below

commit 5a612dac4a6ebdf44d73d5d23fed5419e2519eb1
Author: Andrey Anshin 
AuthorDate: Mon Mar 18 22:08:55 2024 +0400

Resolve PT008: Use `return_value=` instead of patching with `lambda` 
(#38244)
---
 pyproject.toml |  1 -
 tests/api_connexion/endpoints/test_dag_run_endpoint.py | 15 +++
 tests/cli/commands/test_task_command.py|  4 ++--
 tests/providers/google/cloud/hooks/test_cloud_build.py | 15 +++
 4 files changed, 20 insertions(+), 15 deletions(-)

diff --git a/pyproject.toml b/pyproject.toml
index 6114cf93d5..b913274769 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1358,7 +1358,6 @@ ignore = [
 "PT005", # Fixture returns a value, remove leading underscore
 "PT006", # Wrong type of names in @pytest.mark.parametrize
 "PT007", # Wrong type of values in @pytest.mark.parametrize
-"PT008", # Use return_value= instead of patching with lambda
 "PT011", # pytest.raises() is too broad, set the match parameter
 "PT018", # assertion should be broken down into multiple parts
 "PT019", # fixture without value is injected as parameter, use 
@pytest.mark.usefixtures instead
diff --git a/tests/api_connexion/endpoints/test_dag_run_endpoint.py 
b/tests/api_connexion/endpoints/test_dag_run_endpoint.py
index 03b9e3a5b9..ef91ef78eb 100644
--- a/tests/api_connexion/endpoints/test_dag_run_endpoint.py
+++ b/tests/api_connexion/endpoints/test_dag_run_endpoint.py
@@ -1157,6 +1157,7 @@ class TestGetDagRunBatchDateFilters(TestDagRunEndpoint):
 
 
 class TestPostDagRun(TestDagRunEndpoint):
+@time_machine.travel(timezone.utcnow(), tick=False)
 @pytest.mark.parametrize("logical_date_field_name", ["execution_date", 
"logical_date"])
 @pytest.mark.parametrize(
 "dag_run_id, logical_date, note, data_interval_start, 
data_interval_end",
@@ -1188,8 +1189,7 @@ class TestPostDagRun(TestDagRunEndpoint):
 ):
 self._create_dag("TEST_DAG_ID")
 
-# We'll patch airflow.utils.timezone.utcnow to always return this so we
-# can check the returned dates.
+# We freeze time for this test, so we can check it against the returned 
dates.
 fixed_now = timezone.utcnow()
 
 # raise NotImplementedError("TODO: Add tests for data_interval_start 
and data_interval_end")
@@ -1205,12 +1205,11 @@ class TestPostDagRun(TestDagRunEndpoint):
 request_json["data_interval_end"] = data_interval_end
 
 request_json["note"] = note
-with mock.patch("airflow.utils.timezone.utcnow", lambda: fixed_now):
-response = self.client.post(
-"api/v1/dags/TEST_DAG_ID/dagRuns",
-json=request_json,
-environ_overrides={"REMOTE_USER": "test"},
-)
+response = self.client.post(
+"api/v1/dags/TEST_DAG_ID/dagRuns",
+json=request_json,
+environ_overrides={"REMOTE_USER": "test"},
+)
 
 assert response.status_code == 200
 
diff --git a/tests/cli/commands/test_task_command.py 
b/tests/cli/commands/test_task_command.py
index d345075213..056e9cddc2 100644
--- a/tests/cli/commands/test_task_command.py
+++ b/tests/cli/commands/test_task_command.py
@@ -165,12 +165,12 @@ class TestCliTasks:
 ["tasks", "test", "example_python_operator", "print_the_context", 
"2018-01-01"],
 )
 
-with mock.patch("airflow.models.TaskInstance.run", new=lambda *_, 
**__: print(password)):
+with mock.patch("airflow.models.TaskInstance.run", side_effect=lambda 
*_, **__: print(password)):
 task_command.task_test(args)
 assert capsys.readouterr().out.endswith("***\n")
 
 not_password = "!4321drowssapemos"
-with mock.patch("airflow.models.TaskInstance.run", new=lambda *_, 
**__: print(not_password)):
+with mock.patch("airflow.models.TaskInstance.run", side_effect=lambda 
*_, **__: print(not_password)):
 task_command.task_test(args)
 assert capsys.readouterr().out.endswith(f"{not_password}\n")
 
diff --git a/tests/providers/google/cloud/hooks/test_cloud_build.py 
b/tests/providers/google/cloud/hooks/test_cloud_build.py
index 7fb00f279f..69c123a210 100644
--- a/tests/providers/google/cloud/hooks/test_cloud_build.py
+++ b/tests/providers/google/cloud/hooks/test_cloud_build.py
@@ -363,15 +363,18 @@ class TestAsyncHook:
 )
 
 @pytest.mark.asyncio
-@mock.patch.object(
-CloudBuildAsyncClient, "__init__", lambda self, credentials, 
client_info, client_options: None
-)
 

Re: [PR] Resolve PT008: Use `return_value=` instead of patching with `lambda` [airflow]

2024-03-18 Thread via GitHub


Taragolis merged PR #38244:
URL: https://github.com/apache/airflow/pull/38244


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
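The PT008 change merged above prefers `return_value=` (or `side_effect=` when the stub must run code, as in the `task_command` test) over patching with a bare `lambda` via `new=`. A minimal stdlib-only sketch of the difference — the `Greeter` class and method names below are illustrative, not from the Airflow codebase:

```python
from unittest import mock


class Greeter:
    """Illustrative stand-in for the patched Airflow objects."""

    def greet(self) -> str:
        return "hello"


# Discouraged by PT008: `new=lambda ...` swaps in a plain function, so the
# patch records no calls and cannot be asserted on afterwards.
with mock.patch.object(Greeter, "greet", new=lambda self: "patched"):
    assert Greeter().greet() == "patched"

# Preferred: `return_value=` installs a MagicMock, keeping call tracking.
with mock.patch.object(Greeter, "greet", return_value="patched") as greet_mock:
    assert Greeter().greet() == "patched"
    greet_mock.assert_called_once()

# When the replacement must run code, `side_effect=` keeps the MagicMock
# (and its call tracking) while still executing the callable.
with mock.patch.object(Greeter, "greet", side_effect=lambda: "computed") as greet_mock:
    assert Greeter().greet() == "computed"
    greet_mock.assert_called_once()
```

This is why the `test_task_command.py` hunk above only switches `new=` to `side_effect=` rather than `return_value=`: the lambdas there print as a side effect, which `return_value=` alone cannot express.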



Re: [PR] Partially enable PT012 rule [airflow]

2024-03-18 Thread via GitHub


Taragolis merged PR #38219:
URL: https://github.com/apache/airflow/pull/38219





(airflow) branch main updated: Partially enable PT012 rule (#38219)

2024-03-18 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new a0c2071834 Partially enable PT012 rule (#38219)
a0c2071834 is described below

commit a0c2071834506f2342e9ac88c65003510a5efde8
Author: Andrey Anshin 
AuthorDate: Mon Mar 18 22:03:54 2024 +0400

Partially enable PT012 rule (#38219)

* Partially enable PT012 rule

* Fixup 'tests/www/test_app.py' tests

* Fixup 
'tests/decorators/test_bash.py::TestBashDecorator::test_cwd_is_file' tests
---
 pyproject.toml | 77 +++-
 tests/cli/commands/test_task_command.py|  3 +-
 tests/cli/test_cli_parser.py   |  7 ++-
 tests/core/test_stats.py   | 13 ++--
 tests/dag_processing/test_job_runner.py| 32 +-
 tests/decorators/test_bash.py  | 48 ---
 tests/decorators/test_setup_teardown.py|  8 ++-
 tests/jobs/test_backfill_job.py|  6 +-
 tests/jobs/test_triggerer_job.py   |  2 +-
 tests/models/test_baseoperator.py  | 49 +++
 tests/models/test_dag.py   | 16 +++--
 tests/models/test_param.py |  4 +-
 tests/models/test_taskinstance.py  |  2 +-
 tests/models/test_taskmixin.py |  1 -
 tests/operators/test_python.py |  9 ++-
 tests/operators/test_subdag_operator.py|  4 +-
 tests/operators/test_trigger_dagrun.py |  7 +--
 tests/security/test_kerberos.py| 10 ++--
 tests/sensors/test_external_task_sensor.py | 32 +-
 tests/sensors/test_filesystem.py   |  9 ++-
 tests/serialization/test_serde.py  |  6 +-
 tests/utils/test_sqlalchemy.py |  4 +-
 tests/utils/test_task_group.py | 42 +++--
 tests/www/test_app.py  | 96 +++---
 24 files changed, 263 insertions(+), 224 deletions(-)

diff --git a/pyproject.toml b/pyproject.toml
index 97aceac2af..6114cf93d5 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1360,7 +1360,6 @@ ignore = [
 "PT007", # Wrong type of values in @pytest.mark.parametrize
 "PT008", # Use return_value= instead of patching with lambda
 "PT011", # pytest.raises() is too broad, set the match parameter
-"PT012", # [controversial rule] pytest.raises() block should contain a 
single simple statement.
 "PT018", # assertion should be broken down into multiple parts
 "PT019", # fixture without value is injected as parameter, use 
@pytest.mark.usefixtures instead
 ]
@@ -1405,7 +1404,7 @@ combine-as-imports = true
 "dev/breeze/tests/*" = ["TID253", "S101"]
 "tests/*" = ["D", "TID253", "S101"]
 "docker_tests/*" = ["D", "TID253", "S101"]
-"kubernetes_tests/*" = ["D", "TID253", "S101"]
+"kubernetes_tests/*" = ["D", "TID253", "S101", "PT012"]
 "helm_tests/*" = ["D", "TID253", "S101"]
 
 # All of the modules which have an extra license header (i.e. that we copy 
from another project) need to
@@ -1417,7 +1416,7 @@ combine-as-imports = true
 "tests/providers/elasticsearch/log/elasticmock/__init__.py" = ["E402"]
 "tests/providers/elasticsearch/log/elasticmock/utilities/__init__.py" = 
["E402"]
 "tests/providers/openai/hooks/test_openai.py" = ["E402"]
-"tests/providers/openai/operators/test_openai.py" = ["E402"]
+"tests/providers/openai/operators/test_openai.py" = ["E402", "PT012"]
 "tests/providers/qdrant/hooks/test_qdrant.py" = ["E402"]
 "tests/providers/qdrant/operators/test_qdrant.py" = ["E402"]
 "tests/providers/snowflake/operators/test_snowflake_sql.py" = ["E402"]
@@ -1488,6 +1487,78 @@ combine-as-imports = true
 "airflow/providers/smtp/hooks/smtp.py" = ["D105"]
 "airflow/providers/tableau/hooks/tableau.py" = ["D105"]
 
+# All the test modules which do not follow PT012 yet
+"tests/providers/amazon/aws/hooks/test_base_aws.py" = ["PT012"]
+"tests/providers/amazon/aws/hooks/test_datasync.py" = ["PT012"]
+"tests/providers/amazon/aws/hooks/test_eks.py" = ["PT012"]
+"tests/providers/amazon/aws/hooks/test_redshift_data.py" = ["PT012"]
+"tests/providers/amazon/aws/hooks/test_s3.py" = ["PT012"]
+"tests/providers/amazon/aws/operators/test_emr_serverless.py" = ["PT012"]
+"tests/providers/amazon/aws/operators/test_redshift_data.py" = ["PT012"]
+"tests/providers/amazon/aws/sensors/test_glacier.py" = ["PT012"]
+"tests/providers/amazon/aws/sensors/test_glue.py" = ["PT012"]
+"tests/providers/amazon/aws/sensors/test_lambda_function.py" = ["PT012"]
+"tests/providers/amazon/aws/system/utils/test_helpers.py" = ["PT012"]
+"tests/providers/amazon/aws/transfers/test_redshift_to_s3.py" = ["PT012"]
+"tests/providers/amazon/aws/triggers/test_ecs.py" = ["PT012"]
+"tests/providers/amazon/aws/waiters/test_neptune.py" = ["PT012"]
+"tests/providers/apache/beam/hooks/test_beam.py" = ["PT012"]
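PT012, which the commit above partially enables, flags `pytest.raises` blocks that wrap more than a single simple statement. A minimal before/after sketch — the helper and test names are illustrative, not from the Airflow test suite:

```python
import pytest


def parse_positive(value: str) -> int:
    """Illustrative helper; raises ValueError for non-positive input."""
    number = int(value)
    if number <= 0:
        raise ValueError("must be positive")
    return number


# Flagged by PT012: the setup statement shares the raises block, so an
# unexpected ValueError while building `raw` would also (wrongly) pass.
def test_parse_negative_flagged():
    with pytest.raises(ValueError):
        raw = "-3"
        parse_positive(raw)


# Compliant: only the single statement expected to raise stays inside.
def test_parse_negative_compliant():
    raw = "-3"
    with pytest.raises(ValueError):
        parse_positive(raw)
```

Narrowing the block this way is what most of the per-file changes in the diff above do; files not yet converted are listed under the per-file `"PT012"` ignores instead.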

Re: [PR] Adding max consecutive failed dag runs info in UI [airflow]

2024-03-18 Thread via GitHub


bbovenzi commented on PR #38229:
URL: https://github.com/apache/airflow/pull/38229#issuecomment-2004590920

   I would prefer we access this through the REST API and show it in the DAG 
"Details" panel 
[here](https://github.com/apache/airflow/blob/main/airflow/www/static/js/dag/details/dag/Dag.tsx).
 





Re: [PR] Fix `region` argument in `MappedOperator` based on `AwsBaseOperator` / `AwsBaseSensor` [airflow]

2024-03-18 Thread via GitHub


Taragolis merged PR #38178:
URL: https://github.com/apache/airflow/pull/38178





(airflow) branch main updated (b5b972a106 -> 6029c71e2c)

2024-03-18 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from b5b972a106 Update yanked versions in providers changelogs (#38262)
 add 6029c71e2c Fix `region` argument in `MappedOperator` based on 
`AwsBaseOperator` / `AwsBaseSensor` (#38178)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/operators/base_aws.py |  5 ++-
 airflow/providers/amazon/aws/sensors/base_aws.py   |  5 ++-
 .../amazon/aws/operators/test_base_aws.py  | 46 
 tests/providers/amazon/aws/operators/test_ecs.py   | 49 ++
 .../providers/amazon/aws/sensors/test_base_aws.py  | 46 
 5 files changed, 149 insertions(+), 2 deletions(-)



(airflow) branch main updated: Update yanked versions in providers changelogs (#38262)

2024-03-18 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
 new b5b972a106 Update yanked versions in providers changelogs (#38262)
b5b972a106 is described below

commit b5b972a1068e19b09d48ec4d7663dd1d996d594f
Author: Andrey Anshin 
AuthorDate: Mon Mar 18 21:52:12 2024 +0400

Update yanked versions in providers changelogs (#38262)
---
 airflow/providers/amazon/CHANGELOG.rst  |  6 --
 airflow/providers/cncf/kubernetes/CHANGELOG.rst |  9 +
 airflow/providers/common/io/CHANGELOG.rst   | 18 +++---
 airflow/providers/common/sql/CHANGELOG.rst  | 12 
 airflow/providers/databricks/CHANGELOG.rst  | 12 
 airflow/providers/elasticsearch/CHANGELOG.rst   | 23 +--
 airflow/providers/ftp/CHANGELOG.rst |  6 --
 airflow/providers/http/CHANGELOG.rst|  6 --
 airflow/providers/imap/CHANGELOG.rst|  6 --
 airflow/providers/postgres/CHANGELOG.rst|  6 --
 airflow/providers/sftp/CHANGELOG.rst|  6 --
 airflow/providers/smtp/CHANGELOG.rst|  6 --
 airflow/providers/snowflake/CHANGELOG.rst   | 14 --
 airflow/providers/sqlite/CHANGELOG.rst  |  6 --
 14 files changed, 89 insertions(+), 47 deletions(-)

diff --git a/airflow/providers/amazon/CHANGELOG.rst 
b/airflow/providers/amazon/CHANGELOG.rst
index fced720efb..71bf3f39a9 100644
--- a/airflow/providers/amazon/CHANGELOG.rst
+++ b/airflow/providers/amazon/CHANGELOG.rst
@@ -1705,8 +1705,10 @@ Misc
* ``Fix mypy errors in amazon aws transfer (#20590)``
* ``Update documentation for provider December 2021 release (#20523)``
 
-2.5.0
-.
+2.5.0 (YANKED)
+..
+
+.. warning:: This release has been **yanked** with a reason: ``Contains 
breaking changes``
 
 Features
 
diff --git a/airflow/providers/cncf/kubernetes/CHANGELOG.rst 
b/airflow/providers/cncf/kubernetes/CHANGELOG.rst
index c6954bbc5e..4b991b5beb 100644
--- a/airflow/providers/cncf/kubernetes/CHANGELOG.rst
+++ b/airflow/providers/cncf/kubernetes/CHANGELOG.rst
@@ -951,6 +951,8 @@ Bug Fixes
 3.1.2 (YANKED)
 ..
 
+.. warning:: This release has been **yanked** with a reason: ``Installing on 
Airflow 2.1, 2.2 allows to install unsupported kubernetes library > 11.0.0``
+
 Bug Fixes
 ~
 
@@ -965,6 +967,8 @@ Misc
 3.1.1 (YANKED)
 ..
 
+.. warning:: This release has been **yanked** with a reason: ``Installing on 
Airflow 2.1, 2.2 allows to install unsupported kubernetes library > 11.0.0``
+
 Misc
 ~
 
@@ -973,6 +977,8 @@ Misc
 3.1.0 (YANKED)
 ..
 
+.. warning:: This release has been **yanked** with a reason: ``Installing on 
Airflow 2.1, 2.2 allows to install unsupported kubernetes library > 11.0.0``
+
 Features
 
 
@@ -993,6 +999,8 @@ Misc
 3.0.2 (YANKED)
 ..
 
+.. warning:: This release has been **yanked** with a reason: ``Installing on 
Airflow 2.1, 2.2 allows to install unsupported kubernetes library > 11.0.0``
+
 Bug Fixes
 ~
 
@@ -1009,6 +1017,7 @@ Bug Fixes
 3.0.1 (YANKED)
 ..
 
+.. warning:: This release has been **yanked** with a reason: ``Installing on 
Airflow 2.1, 2.2 allows to install unsupported kubernetes library > 11.0.0``
 
 Misc
 
diff --git a/airflow/providers/common/io/CHANGELOG.rst 
b/airflow/providers/common/io/CHANGELOG.rst
index a31e94da1c..cd867f7637 100644
--- a/airflow/providers/common/io/CHANGELOG.rst
+++ b/airflow/providers/common/io/CHANGELOG.rst
@@ -62,16 +62,10 @@ Features
* ``Use reproducible builds for provider packages (#35693)``
* ``Fix and reapply templates for provider documentation (#35686)``
 
-1.0.1
-.
-
-Breaking changes
-
-
-
-Features
-
+1.0.1 (YANKED)
+..
 
+.. warning:: This release has been **yanked** with a reason: ``Used older 
interface from 2.8.0.dev0 versions``
 
 Bug Fixes
 ~
@@ -82,7 +76,9 @@ Bug Fixes
appropriate section above if needed. Do not delete the lines(!):
* ``Improvements to airflow.io (#35478)``
 
-1.0.0
-.
+1.0.0 (YANKED)
+..
+
+.. warning:: This release has been **yanked** with a reason: ``Used older 
interface from 2.8.0.dev0 versions``
 
 Initial version of the provider.
diff --git a/airflow/providers/common/sql/CHANGELOG.rst 
b/airflow/providers/common/sql/CHANGELOG.rst
index 57dac0ecc9..4f8edc6318 100644
--- a/airflow/providers/common/sql/CHANGELOG.rst
+++ b/airflow/providers/common/sql/CHANGELOG.rst
@@ -241,8 +241,10 @@ Misc
 
 * ``Bring back min-airflow-version for preinstalled providers (#31469)``
 
-1.5.0
-.
+1.5.0 (YANKED)
+..
+
+.. warning:: This release has been **yanked** with a reason: ``This version 
might cause unconstrained installation of old airflow 

Re: [PR] Update yanked versions in providers changelogs [airflow]

2024-03-18 Thread via GitHub


Taragolis merged PR #38262:
URL: https://github.com/apache/airflow/pull/38262
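The changelog convention applied throughout the commit above is uniform: a `(YANKED)` suffix on the version heading plus an RST `warning` directive stating the yank reason. A minimal sketch of the pattern, with a placeholder version and reason:

```rst
1.2.3 (YANKED)
..............

.. warning:: This release has been **yanked** with a reason: ``example reason``
```

Keeping the warning directly under the version heading means the reason renders next to the release notes it applies to.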





Re: [PR] Fix dynamic allocation specs handling for custom launcher [airflow]

2024-03-18 Thread via GitHub


Taragolis merged PR #38223:
URL: https://github.com/apache/airflow/pull/38223





(airflow) branch main updated (d8381ed250 -> d4350a6bed)

2024-03-18 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from d8381ed250 Update SqlToSlackApiFileOperator with new param to check 
empty output (#38079)
 add d4350a6bed Fix dynamic allocation specs handling for custom launcher 
(#38223)

No new revisions were added by this update.

Summary of changes:
 .../kubernetes/operators/custom_object_launcher.py |   6 +-
 .../operators/test_custom_object_launcher.py   | 128 +
 2 files changed, 131 insertions(+), 3 deletions(-)


