eladkal commented on code in PR #37998:
URL: https://github.com/apache/airflow/pull/37998#discussion_r1518051764
##
airflow/providers/cncf/kubernetes/hooks/kubernetes.py:
##
@@ -512,6 +513,36 @@ def get_job(self, job_name: str, namespace: str) -> V1Job:
"""
ret
pankajastro opened a new pull request, #38075:
URL: https://github.com/apache/airflow/pull/38075
Currently, the KPO trigger utilizes the polling_interval parameter to
check the status of a pod during its startup. However,
this approach appears incorrect and inconsistent with
the sy
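The description is truncated, but the idea of a trigger polling a pod at `polling_interval` during startup can be sketched as follows. This is an illustrative sketch only, with hypothetical names (`wait_for_pod_start`, `get_phase`), not the actual `KubernetesPodTrigger` implementation:

```python
import asyncio

async def wait_for_pod_start(get_phase, polling_interval: float, timeout: float) -> str:
    """Poll the pod phase every `polling_interval` seconds until it leaves
    'Pending' or `timeout` elapses. Hypothetical sketch, not provider code."""
    elapsed = 0.0
    while True:
        phase = await get_phase()
        if phase != "Pending":
            return phase
        if elapsed >= timeout:
            raise TimeoutError("pod did not start in time")
        await asyncio.sleep(polling_interval)
        elapsed += polling_interval
```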
cjay2067 opened a new pull request, #38076:
URL: https://github.com/apache/airflow/pull/38076
---
**^ Add meaningful description above**
Read the **[Pull Request
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#p
boring-cyborg[bot] commented on PR #38076:
URL: https://github.com/apache/airflow/pull/38076#issuecomment-1991686122
Congratulations on your first Pull Request and welcome to the Apache Airflow
community! If you have any issues or are unsure about anything, please check
our Contributors'
cjay2067 closed pull request #38076: adding index file
URL: https://github.com/apache/airflow/pull/38076
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail:
cjay2067 commented on PR #38076:
URL: https://github.com/apache/airflow/pull/38076#issuecomment-1991689304
aa
cjay2067 closed pull request #38076: adding index file
URL: https://github.com/apache/airflow/pull/38076
moiseenkov commented on code in PR #37998:
URL: https://github.com/apache/airflow/pull/37998#discussion_r1521550867
##
airflow/providers/cncf/kubernetes/hooks/kubernetes.py:
##
@@ -512,6 +514,33 @@ def get_job(self, job_name: str, namespace: str) -> V1Job:
"""
vincbeck merged PR #38044:
URL: https://github.com/apache/airflow/pull/38044
pankajastro commented on issue #38003:
URL: https://github.com/apache/airflow/issues/38003#issuecomment-1991780598
Ok, so I did some investigation.
The POST_TERMINATION_TIMEOUT is set to 120 seconds, which means that the
pod's logs will be available for retrieval for up to 120 seconds aft
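The timing constraint described above (logs retrievable for up to 120 seconds after the pod terminates) can be illustrated with a small check. The constant value is quoted from the comment; the function name and shape are assumptions, not the provider's actual code:

```python
from datetime import datetime, timedelta, timezone

POST_TERMINATION_TIMEOUT = 120  # seconds, as quoted in the comment above

def logs_still_available(terminated_at: datetime, now: datetime) -> bool:
    """Whether pod logs should still be retrievable, assuming they remain
    available for POST_TERMINATION_TIMEOUT seconds after termination.
    Illustrative sketch only."""
    return now - terminated_at <= timedelta(seconds=POST_TERMINATION_TIMEOUT)
```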
potiuk opened a new pull request, #38077:
URL: https://github.com/apache/airflow/pull/38077
GitHub Actions has a strange and complex way of handling conditional jobs. When
you want a job to depend on another job and run when that job is either
successful or fully skipped, you have to write a r
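The commonly used workaround for "run when the dependency succeeded or was skipped" is an explicit `if` expression on the dependent job. This is a generic illustration of that pattern (hypothetical job names), not the actual change in this PR:

```yaml
jobs:
  build:
    # ... may be skipped by a path filter or condition
  test:
    needs: [build]
    # Run even when `build` was skipped, but not when it failed,
    # since a plain `needs` would propagate the skip downstream.
    if: always() && (needs.build.result == 'success' || needs.build.result == 'skipped')
```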
potiuk commented on code in PR #38077:
URL: https://github.com/apache/airflow/pull/38077#discussion_r1521587714
##
.github/workflows/ci.yml:
##
@@ -438,12 +438,7 @@ jobs:
name: "Wait for CI images"
runs-on: ["ubuntu-22.04"]
needs: [build-info, build-ci-images]
-
molcay commented on PR #37087:
URL: https://github.com/apache/airflow/pull/37087#issuecomment-1991817431
@potiuk thank you for the detailed answer. Your suggestion totally makes
sense.
I started to look at what we can do and did some PoC.
Here is the summary of what
potiuk commented on PR #38077:
URL: https://github.com/apache/airflow/pull/38077#issuecomment-1991830976
> Hmm, quite interesting 🤔. Looks good to me
Indeed. I **thought** I found a clever way to workaround the limitation, but
I was wrong.
potiuk commented on PR #38077:
URL: https://github.com/apache/airflow/pull/38077#issuecomment-1991833132
I added a nicer indication now of when we build and when we skip in-workflow
builds.
potiuk merged PR #38077:
URL: https://github.com/apache/airflow/pull/38077
potiuk commented on PR #38077:
URL: https://github.com/apache/airflow/pull/38077#issuecomment-1991858433
Uff :)
vincbeck opened a new pull request, #38078:
URL: https://github.com/apache/airflow/pull/38078
In AWS Identity Center there is no notion of roles, only groups. We used to
translate role -> group, but I think it is easier for
users to use the same concept across the diffe
renanxx1 commented on issue #38017:
URL: https://github.com/apache/airflow/issues/38017#issuecomment-1991916946
I had the same issue described in the post after upgrading from 2.8.1 to
2.8.2. Will it be fixed in 2.8.3 version ?
potiuk commented on issue #38017:
URL: https://github.com/apache/airflow/issues/38017#issuecomment-1991931266
> I had the same issue described in the post after upgrading from 2.8.1 to
2.8.2. Will it be fixed in 2.8.3 version ?
Well. The merge happened 6 hours ago and 2.8.3 has been r
vincbeck commented on PR #37638:
URL: https://github.com/apache/airflow/pull/37638#issuecomment-1991985289
> breeze testing tests
tests/api_connexion/endpoints/test_connection_endpoint.py
It might be my environment that is wrongly set up ... When I run `breeze
testing tests tests/api
bbovenzi commented on code in PR #38021:
URL: https://github.com/apache/airflow/pull/38021#discussion_r1521730909
##
airflow/www/static/js/dag/details/taskInstance/Logs/LogBlock.tsx:
##
@@ -59,10 +67,43 @@ const LogBlock = ({ parsedLogs, wrap, tryNumber }: Props)
=> {
}
bbovenzi commented on code in PR #38021:
URL: https://github.com/apache/airflow/pull/38021#discussion_r1521734303
##
airflow/www/static/js/dag/details/taskInstance/Logs/LogBlock.tsx:
##
@@ -59,10 +67,43 @@ const LogBlock = ({ parsedLogs, wrap, tryNumber }: Props)
=> {
}
bbovenzi commented on issue #37774:
URL: https://github.com/apache/airflow/issues/37774#issuecomment-1992002839
Yes, I noticed a bug with the base_date logic. I'll work on a fix.
dirrao commented on code in PR #38075:
URL: https://github.com/apache/airflow/pull/38075#discussion_r1521755359
##
airflow/providers/cncf/kubernetes/triggers/pod.py:
##
@@ -92,6 +93,7 @@ def __init__(
in_cluster: bool | None = None,
get_logs: bool = True,
pankajastro commented on code in PR #38075:
URL: https://github.com/apache/airflow/pull/38075#discussion_r1521760256
##
airflow/providers/cncf/kubernetes/triggers/pod.py:
##
@@ -92,6 +93,7 @@ def __init__(
in_cluster: bool | None = None,
get_logs: bool = True,
uranusjr commented on issue #37810:
URL: https://github.com/apache/airflow/issues/37810#issuecomment-1992043317
> if task return value (==XCom) shall be taken over as `extra` event data.
So if the marker is set, the return value goes to the dataset event’s extra,
_instead of_ (not in
andyguwc opened a new pull request, #38079:
URL: https://github.com/apache/airflow/pull/38079
This PR adds a new param to the SqlToSlackApiFileOperator to optionally
check for null output. It raises an exception instead of sending a slack with
an empty file.
closes: https://github.c
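The behaviour the PR proposes (raise instead of sending Slack an empty file) can be sketched as below. The exception and function names here are hypothetical stand-ins, not the actual `SqlToSlackApiFileOperator` code:

```python
class NullOutputError(Exception):
    """Raised when the SQL query returned no rows. Hypothetical name; the PR
    adds a similar exception for SqlToSlackApiFileOperator."""

def maybe_fail_on_empty(rows: list, fail_on_empty: bool) -> list:
    """Sketch of the proposed check: fail loudly instead of uploading an
    empty file when the query produced no output."""
    if fail_on_empty and not rows:
        raise NullOutputError("query returned no rows; not sending an empty file")
    return rows
```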
dirrao commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521779196
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -134,6 +141,8 @@ def execute(self, context: Context) -> None:
output_file_name = fp.name
a-narsudinov commented on issue #34196:
URL: https://github.com/apache/airflow/issues/34196#issuecomment-1992082189
Have the same issue during upgrade from 1.10 to 1.13:
```
Error: UPGRADE FAILED: failed to create resource: admission webhook
"validate.nginx.ingress.kubernetes.io" denie
eladkal commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521796637
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -58,6 +62,7 @@ class SqlToSlackApiFileOperator(BaseSqlToSlackOperator):
:param slack_base_url: A stri
eladkal commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521796637
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -58,6 +62,7 @@ class SqlToSlackApiFileOperator(BaseSqlToSlackOperator):
:param slack_base_url: A stri
dirrao commented on code in PR #37998:
URL: https://github.com/apache/airflow/pull/37998#discussion_r1521811359
##
airflow/providers/cncf/kubernetes/operators/job.py:
##
@@ -135,6 +144,18 @@ def execute(self, context: Context):
ti.xcom_push(key="job_name", value=self.jo
itsnotapt commented on issue #38003:
URL: https://github.com/apache/airflow/issues/38003#issuecomment-1992122986
That appears to do the trick. Thanks for your looking into this so quickly.
moiseenkov commented on code in PR #37998:
URL: https://github.com/apache/airflow/pull/37998#discussion_r1521820110
##
airflow/providers/cncf/kubernetes/operators/job.py:
##
@@ -135,6 +144,18 @@ def execute(self, context: Context):
ti.xcom_push(key="job_name", value=sel
Taragolis commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521824259
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -33,6 +33,10 @@
from airflow.utils.context import Context
+class SqlToSlackNullOutputException(
moiseenkov commented on code in PR #37998:
URL: https://github.com/apache/airflow/pull/37998#discussion_r1521820110
##
airflow/providers/cncf/kubernetes/operators/job.py:
##
@@ -135,6 +144,18 @@ def execute(self, context: Context):
ti.xcom_push(key="job_name", value=sel
bbovenzi opened a new pull request, #38080:
URL: https://github.com/apache/airflow/pull/38080
It is not always obvious that we have active filters on the DAG page. We
should make it more obvious so users are less confused.
Clear button is disabled if no filters:
https://github.com/
potiuk commented on PR #38074:
URL: https://github.com/apache/airflow/pull/38074#issuecomment-1992151185
All right - hopefully this time it will get Green
pankajastro opened a new pull request, #38081:
URL: https://github.com/apache/airflow/pull/38081
Looks like in the async KPO the termination step is producing some duplicate
logs. This PR fixes it with `follow` and `last_log_time`.
---
**^ Add meaningful descripti
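Deduplicating logs with a last-seen timestamp, as the `last_log_time` mention suggests, can be sketched like this. The function is an illustrative assumption; the actual PR code is not shown in the snippet:

```python
from datetime import datetime

def new_log_lines(lines, last_log_time):
    """Return only (timestamp, message) pairs newer than `last_log_time`,
    plus the newest timestamp seen, so already-emitted lines are not
    printed again. Illustrative sketch only."""
    fresh = [(ts, msg) for ts, msg in lines
             if last_log_time is None or ts > last_log_time]
    newest = max((ts for ts, _ in fresh), default=last_log_time)
    return fresh, newest
```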
pankajastro commented on issue #38003:
URL: https://github.com/apache/airflow/issues/38003#issuecomment-1992161193
> When reading the logs in `trigger_reentry` when the trigger return a
"success" status, the operator read the log by calling `self.write_logs`
>
>
https://github.com/ap
vincbeck merged PR #38078:
URL: https://github.com/apache/airflow/pull/38078
pankajastro commented on code in PR #38075:
URL: https://github.com/apache/airflow/pull/38075#discussion_r1521883087
##
airflow/providers/cncf/kubernetes/triggers/pod.py:
##
@@ -92,6 +93,7 @@ def __init__(
in_cluster: bool | None = None,
get_logs: bool = True,
tianyou-gu commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521885687
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -33,6 +33,10 @@
from airflow.utils.context import Context
+class SqlToSlackNullOutputException
ferruzzi commented on PR #37936:
URL: https://github.com/apache/airflow/pull/37936#issuecomment-1992231889
I never said don't fix it. It's just a matter of whether we call it a bugfix
and fix it now, creating a breaking change and sending users scrambling to
figure out why their queues are sudd
uranusjr commented on PR #37945:
URL: https://github.com/apache/airflow/pull/37945#issuecomment-1992260248
I like this
potiuk commented on PR #38074:
URL: https://github.com/apache/airflow/pull/38074#issuecomment-1992267182
one more issue with serde test
uranusjr commented on PR #37087:
URL: https://github.com/apache/airflow/pull/37087#issuecomment-1992269005
I like having this new field. Instead of NULL, I would just add a type
`scheduler`. Should this value be shown anywhere on the UI?
uranusjr commented on code in PR #38066:
URL: https://github.com/apache/airflow/pull/38066#discussion_r1521933004
##
airflow/settings.py:
##
@@ -273,7 +273,7 @@ def configure_orm(disable_connection_pool=False,
pool_class=None):
DEFAULT_ENGINE_ARGS = {
"postgresql": {
-
uranusjr commented on code in PR #38015:
URL: https://github.com/apache/airflow/pull/38015#discussion_r1521946748
##
airflow/decorators/base.py:
##
@@ -208,11 +208,35 @@ def __init__(
# since values for those will be provided when the task is run. Since
# we're
uranusjr commented on code in PR #38015:
URL: https://github.com/apache/airflow/pull/38015#discussion_r1521947442
##
tests/decorators/test_python.py:
##
@@ -259,6 +259,16 @@ def add_number(num: int) -> int:
add_number()
add_number("test")
+def test_fa
uranusjr commented on code in PR #38015:
URL: https://github.com/apache/airflow/pull/38015#discussion_r1521948279
##
airflow/decorators/base.py:
##
@@ -208,11 +208,35 @@ def __init__(
# since values for those will be provided when the task is run. Since
# we're
uranusjr commented on code in PR #38054:
URL: https://github.com/apache/airflow/pull/38054#discussion_r1521953578
##
airflow/models/baseoperator.py:
##
@@ -255,6 +257,7 @@ def partial(
on_retry_callback: None | TaskStateChangeCallback |
list[TaskStateChangeCallback] | ArgN
andyguwc commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521958643
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -134,6 +141,8 @@ def execute(self, context: Context) -> None:
output_file_name = fp.name
tianyou-gu commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1521885687
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -33,6 +33,10 @@
from airflow.utils.context import Context
+class SqlToSlackNullOutputException
uranusjr commented on code in PR #38054:
URL: https://github.com/apache/airflow/pull/38054#discussion_r1521962702
##
airflow/models/baseoperator.py:
##
@@ -719,6 +724,8 @@ class derived from this one results in the creation of a
task object,
"on_skipped_callback",
uranusjr commented on code in PR #38054:
URL: https://github.com/apache/airflow/pull/38054#discussion_r1521963038
##
airflow/models/baseoperator.py:
##
@@ -858,6 +866,12 @@ def __init__(
if end_date:
self.end_date = timezone.convert_to_utc(end_date)
+
uranusjr commented on code in PR #38054:
URL: https://github.com/apache/airflow/pull/38054#discussion_r1521964319
##
airflow/models/taskinstance.py:
##
@@ -1252,6 +1254,8 @@ class TaskInstance(Base, LoggingMixin):
queued_dttm = Column(UtcDateTime)
queued_by_job_id = Co
uranusjr commented on PR #37778:
URL: https://github.com/apache/airflow/pull/37778#issuecomment-1992322204
A simple example of how this can be used (that cannot be covered otherwise)
would be useful.
bbovenzi opened a new pull request, #38084:
URL: https://github.com/apache/airflow/pull/38084
Before:
https://github.com/apache/airflow/assets/4600967/8f2e9f57-cd45-4f03-9eec-df134c924fbc
After:
https://github.com/apache/airflow/assets/4600967/3df5bbe4-69f6-42a2-8a3a-249d18890
bbovenzi closed pull request #37945: Move operator color back to a badge vs
node background
URL: https://github.com/apache/airflow/pull/37945
bbovenzi commented on PR #37945:
URL: https://github.com/apache/airflow/pull/37945#issuecomment-1992326541
Closing in favor of https://github.com/apache/airflow/pull/38084
uranusjr commented on code in PR #37990:
URL: https://github.com/apache/airflow/pull/37990#discussion_r1521981646
##
airflow/utils/dag_parameters_overflow.py:
##
@@ -0,0 +1,50 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreeme
uranusjr commented on code in PR #37990:
URL: https://github.com/apache/airflow/pull/37990#discussion_r1521982050
##
airflow/utils/dag_parameters_overflow.py:
##
@@ -0,0 +1,50 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreeme
potiuk opened a new pull request, #38085:
URL: https://github.com/apache/airflow/pull/38085
---
**^ Add meaningful description above**
Read the **[Pull Request
Guidelines](https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst#pul
potiuk merged PR #38085:
URL: https://github.com/apache/airflow/pull/38085
potiuk opened a new pull request, #980:
URL: https://github.com/apache/airflow-site/pull/980
(no comment)
potiuk merged PR #980:
URL: https://github.com/apache/airflow-site/pull/980
potiuk merged PR #38074:
URL: https://github.com/apache/airflow/pull/38074
jscheffl commented on code in PR #38021:
URL: https://github.com/apache/airflow/pull/38021#discussion_r1522049211
##
airflow/www/static/js/dag/details/taskInstance/Logs/LogBlock.tsx:
##
@@ -59,10 +67,43 @@ const LogBlock = ({ parsedLogs, wrap, tryNumber }: Props)
=> {
}
dstandish commented on PR #37855:
URL: https://github.com/apache/airflow/pull/37855#issuecomment-1992497027
@potiuk i have created a discussion item for this
https://github.com/apache/airflow/discussions/38087
uranusjr commented on code in PR #37855:
URL: https://github.com/apache/airflow/pull/37855#discussion_r1522058143
##
airflow/serialization/serialized_objects.py:
##
@@ -528,18 +546,11 @@ def serialize(
def _pydantic_model_dump(model_cls: type[BaseModel], var: Any) -
uranusjr merged PR #38084:
URL: https://github.com/apache/airflow/pull/38084
shahar1 commented on code in PR #38049:
URL: https://github.com/apache/airflow/pull/38049#discussion_r1522083849
##
airflow/providers/google/cloud/operators/vertex_ai/auto_ml.py:
##
@@ -607,7 +609,7 @@ class
DeleteAutoMLTrainingJobOperator(GoogleCloudBaseOperator):
AutoMLT
uranusjr opened a new pull request, #38088:
URL: https://github.com/apache/airflow/pull/38088
For https://github.com/python/mypy/pull/13400
hussein-awala commented on PR #36029:
URL: https://github.com/apache/airflow/pull/36029#issuecomment-1992548233
@jscheffl
The new column is needed to serialize the class instance, to make it
possible to create a parameterized strategy, and without this serialization, we
cannot re-create
uranusjr commented on PR #38088:
URL: https://github.com/apache/airflow/pull/38088#issuecomment-1992619555
Uh, turns out 1.9 does not fix our specific case. Trying to reproduce for
Mypy, but I’ll close this since it is not helpful for what I want.
uranusjr closed pull request #38088: Upgrade Mypy to 1.9.0
URL: https://github.com/apache/airflow/pull/38088
uranusjr opened a new pull request, #38089:
URL: https://github.com/apache/airflow/pull/38089
I went back and forth a bit on this, but ultimately decided it is simplest
to just skip the entire BaseSerialization process and serialize on our
own. We don't need to deserialize dataset_ex
RNHTTR opened a new pull request, #38090:
URL: https://github.com/apache/airflow/pull/38090
It's difficult to find problematic TIs when attempting to debug tasks that
have dozens or hundreds of mapped TIs
---
**^ Add meaningful description above**
Rea
shahar1 commented on code in PR #38048:
URL: https://github.com/apache/airflow/pull/38048#discussion_r1522172634
##
airflow/providers/google/cloud/operators/vertex_ai/custom_job.py:
##
@@ -1355,6 +1357,26 @@ def __init__(
self.gcp_conn_id = gcp_conn_id
self.imp
potiuk opened a new pull request, #38091:
URL: https://github.com/apache/airflow/pull/38091
In case images are built in the "pull-request-target" workflow, we should
skip the builds "in-workflow" - but just "skipping" them is no good because
skip state propagates to downstream jobs - but th
shahar1 commented on code in PR #38048:
URL: https://github.com/apache/airflow/pull/38048#discussion_r1522178724
##
airflow/providers/google/cloud/operators/vertex_ai/custom_job.py:
##
@@ -1328,7 +1330,7 @@ class
DeleteCustomTrainingJobOperator(GoogleCloudBaseOperator):
shahar1 commented on code in PR #38048:
URL: https://github.com/apache/airflow/pull/38048#discussion_r1522178724
##
airflow/providers/google/cloud/operators/vertex_ai/custom_job.py:
##
@@ -1328,7 +1330,7 @@ class
DeleteCustomTrainingJobOperator(GoogleCloudBaseOperator):
shahar1 commented on PR #38048:
URL: https://github.com/apache/airflow/pull/38048#issuecomment-1992673733
@eladkal / @Taragolis ready for your review
dstandish commented on PR #37855:
URL: https://github.com/apache/airflow/pull/37855#issuecomment-1992691317
the one test failure is already present in main (see
https://github.com/apache/airflow/actions/runs/8255201886/job/22581344328),
so I will merge anyway
shahar1 commented on PR #38052:
URL: https://github.com/apache/airflow/pull/38052#issuecomment-1992692921
> is this change backward compatible?
Yes, added a test to validate it
jscheffl commented on code in PR #38080:
URL: https://github.com/apache/airflow/pull/38080#discussion_r1522197900
##
airflow/www/static/js/dag/nav/FilterBar.tsx:
##
@@ -152,6 +208,7 @@ const FilterBar = () => {
background="white"
variant="outline"
uranusjr commented on code in PR #37498:
URL: https://github.com/apache/airflow/pull/37498#discussion_r1522198798
##
airflow/models/baseoperator.py:
##
@@ -826,6 +827,8 @@ def __init__(
self.task_id = task_group.child_id(task_id) if task_group else task_id
if n
uranusjr commented on code in PR #37498:
URL: https://github.com/apache/airflow/pull/37498#discussion_r1522198798
##
airflow/models/baseoperator.py:
##
@@ -826,6 +827,8 @@ def __init__(
self.task_id = task_group.child_id(task_id) if task_group else task_id
if n
dstandish merged PR #37855:
URL: https://github.com/apache/airflow/pull/37855
uranusjr commented on code in PR #37347:
URL: https://github.com/apache/airflow/pull/37347#discussion_r1522202531
##
airflow/utils/log/logging_mixin.py:
##
@@ -147,18 +147,27 @@ def supports_external_link(self) -> bool:
# We have to ignore typing errors here because Python I/
uranusjr commented on code in PR #37347:
URL: https://github.com/apache/airflow/pull/37347#discussion_r1522203004
##
airflow/utils/log/logging_mixin.py:
##
@@ -147,18 +147,27 @@ def supports_external_link(self) -> bool:
# We have to ignore typing errors here because Python I/
bbovenzi opened a new pull request, #38092:
URL: https://github.com/apache/airflow/pull/38092
Upgrade react table from v7 to v8.
Using it just for AuditLog to start. Will expand it later and replace the
legacy `Table` component entirely.
This gave us better control over pagination
jscheffl commented on PR #32520:
URL: https://github.com/apache/airflow/pull/32520#issuecomment-1992708602
I'd really love to have this in - the 2.9.0 release cut is around the corner -
is it possible to get the conflicts resolved, the final bug fixed, and have
this merged?
andyguwc commented on code in PR #38079:
URL: https://github.com/apache/airflow/pull/38079#discussion_r1522216852
##
airflow/providers/slack/transfers/sql_to_slack.py:
##
@@ -134,6 +141,8 @@ def execute(self, context: Context) -> None:
output_file_name = fp.name
bbovenzi commented on PR #37988:
URL: https://github.com/apache/airflow/pull/37988#issuecomment-1992711173
> Alternatively, would it be simple to migrate at least the two "More
Details/Rendered Template" also into React? The old Log view might stay there,
at least I am a fan of the full screen
o-nikolas opened a new pull request, #38093:
URL: https://github.com/apache/airflow/pull/38093
Adding a new property `executors` to the Job class which will contain a
reference to all initialized executors for AIP-61. Schedulers and Backfill jobs
will use the references to all executors for
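An `executors` property holding references to all initialized executors, as the PR describes, can be sketched roughly like this. The class body and loader are assumptions for illustration (the real code goes through `ExecutorLoader`), not the actual Job implementation:

```python
class Job:
    """Minimal sketch of exposing all configured executors via a property.
    Hypothetical loader; the real code uses ExecutorLoader."""

    def __init__(self, executor_names):
        self._executor_names = executor_names
        self._executors = None  # lazily initialized on first access

    @property
    def executors(self):
        # Initialize one executor object per configured name, then cache.
        if self._executors is None:
            self._executors = [f"executor:{name}" for name in self._executor_names]
        return self._executors
```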
uranusjr commented on code in PR #38093:
URL: https://github.com/apache/airflow/pull/38093#discussion_r1522240694
##
airflow/jobs/job.py:
##
@@ -123,6 +123,10 @@ def __init__(self, executor=None, heartrate=None,
**kwargs):
def executor(self):
return ExecutorLoader
uranusjr commented on code in PR #38094:
URL: https://github.com/apache/airflow/pull/38094#discussion_r1522245413
##
airflow/serialization/helpers.py:
##
@@ -37,7 +39,26 @@ def is_jsonable(x):
else:
return True
+max_size = conf.getint("core", "max_tem
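The diff reads a size limit from configuration (the config key is truncated in the snippet, so a plain parameter is used below) and presumably caps oversized serialized template fields. A hedged sketch of that idea, not the actual helper:

```python
def truncate_rendered(value: str, max_size: int) -> str:
    """Cap a rendered template field at `max_size` characters, marking the
    cut. Illustrative sketch; the real helper and config key differ."""
    if len(value) > max_size:
        return value[:max_size] + "... (truncated)"
    return value
```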