keysersoza commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2150271955
We could mitigate the issue a bit by setting these configs:
```
celery:
  task_publish_max_retries: 5
celery_broker_transport_options:
```
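The snippet above is cut off after `celery_broker_transport_options:`. As an illustration only, a minimal sketch of what such a configuration might look like, assuming a Redis broker; the `visibility_timeout` value is an assumption, not taken from the original comment. (`task_publish_max_retries` corresponds to the `AIRFLOW__CELERY__TASK_PUBLISH_MAX_RETRIES` environment variable.)

```yaml
# Sketch only, assuming a Redis broker; visibility_timeout is illustrative
# and not from the original comment.
celery:
  # retry publishing a task to the broker up to 5 times before failing it
  task_publish_max_retries: 5
celery_broker_transport_options:
  # seconds before an unacknowledged task becomes visible for redelivery
  visibility_timeout: 21600
```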
keysersoza commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2148596956
@potiuk I tried to disable the mini scheduler, but it did not fix the issue.
keysersoza commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2148580333
Hi @pavelpi, we are experiencing the same issue:
https://github.com/apache/airflow/issues/40054
We could not observe any resource-related issue; in our case, tasks go
babaMar commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2069604724
@pavelpi what I have learned so far is that tasks in this state get killed by
external processes, typically OS-level ones. For example, I found a DAG that
would run 5 parallel tasks and
pavelpi commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2069583249
It fails on our side as well, on Airflow 2.7.3.
We have 5 tasks running in parallel, all of which succeed. The task that runs
afterwards then reports `dependency 'Task Instance State' FAILED: Task is
kaxil commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2016216787
> Can everyone in this thread make an experiment, disable it and report back here - at least this will give some clue on where the problems might happen.

Yes please,
potiuk commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2015639112
No, I do not see anyone working on it or providing a solution. But the log
above suggests that this is caused by the so-called "mini scheduler", which is
enabled by this configuration
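For reference, the mini scheduler is controlled by the `schedule_after_task_execution` option in the `[scheduler]` section of the Airflow config. A minimal sketch of disabling it, written in the same values-style layout as the config snippet earlier in the thread; the layout itself is illustrative, the option name is the actual Airflow setting:

```yaml
# Sketch: disabling the "mini scheduler" that runs in the task process after
# a task finishes. schedule_after_task_execution is the real Airflow option;
# the values-file layout here is an assumption for illustration.
scheduler:
  schedule_after_task_execution: false
```

The equivalent environment variable is `AIRFLOW__SCHEDULER__SCHEDULE_AFTER_TASK_EXECUTION=False`.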
anderzhao commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2014304310
Also facing this issue. Any solution yet?
AIRFLOW_VERSION=2.7.2
```python
from airflow.sensors.filesystem import FileSensor

file_sensor = FileSensor(
    task_id=f"file_sensor_***",
    filepath="***",
)
```
elmoutassim-mahdaoui-ad commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-2012298111
Also facing this issue on my side. Any solution yet?
soravispr commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-1940396284
Also facing this issue. Any solution yet?
potiuk commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-1851859638
Well, I think the confusion is that you posted it in a completely wrong
thread. Your comment and log have nothing to do with the problem here (or so it
seems). I suggest you open
babaMar commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-1851596900
> > I have been seeing this error appearing after upgrading to 2.7.3, also
using the Celery Executor on K8s. However, I don't see that pattern when
looking at resource
potiuk commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-1850303636
> I have been seeing this error appearing after upgrading to 2.7.3, also
using the Celery Executor on K8s. However, I don't see that pattern when
looking at resource consumption. I
babaMar commented on issue #34013:
URL: https://github.com/apache/airflow/issues/34013#issuecomment-1849565102
I have been seeing this error appearing after upgrading to 2.7.3, also using
the Celery Executor on K8s. However, I don't see that pattern when looking at
resource consumption. I