clvanbrandt commented on issue #57356:
URL: https://github.com/apache/airflow/issues/57356#issuecomment-3507015893
I did some experimentation. With a basic config similar to @geocomm-descue's, I get the following error in my workers (note that in my case it was causing the task itself to fail; tried on the base Docker image with Python 3.10 and 3.12):
```
2025-11-08T21:45:02.740548Z [info ] Task execute_workload[f81a34f7-8f3d-44d9-92e3-4966169bd270] received [celery.worker.strategy] loc=strategy.py:161
2025-11-08T21:45:02.741204Z [debug ] TaskPool: Apply <function fast_trace_task at 0x7fdfee6aca60> (args:('execute_workload', 'f81a34f7-8f3d-44d9-92e3-4966169bd270', {'lang': 'py', 'task': 'execute_workload', 'id': 'f81a34f7-8f3d-44d9-92e3-4966169bd270', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'group_index': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'f81a34f7-8f3d-44d9-92e3-4966169bd270', 'parent_id': None, 'argsrepr':... kwargs:{}) [celery.pool] loc=base.py:149
2025-11-08T21:45:02.751746Z [info ] [f81a34f7-8f3d-44d9-92e3-4966169bd270] Executing workload in Celery: token='eyJ***' ti=TaskInstance(id=UUID('019a656e-0bb2-7623-a972-f9281a872dd8'), dag_version_id=UUID('019a6552-bac4-78ec-981a-2af4698d3850'), task_id='my_task', dag_id='example_dag', run_id='manual__2025-11-08T21:45:00+00:00', try_number=1, map_index=-1, pool_slots=1, queue='default', priority_weight=1, executor_config=None, parent_context_carrier={}, context_carrier={}) dag_rel_path=PurePosixPath('example.py') bundle_info=BundleInfo(name='dags-folder', version=None) log_path='dag_id=example_dag/run_id=manual__2025-11-08T21:45:00+00:00/task_id=my_task/attempt=1.log' type='ExecuteTask' [airflow.providers.celery.executors.celery_executor_utils] loc=celery_executor_utils.py:155
2025-11-08T21:45:02.752903Z [debug ] Connecting to execution API server [supervisor] loc=supervisor.py:1920 server=http://airflow-api-server:8080/execution/
2025-11-08T21:45:02.759193Z [error ] Task execute_workload[f81a34f7-8f3d-44d9-92e3-4966169bd270] raised unexpected: RuntimeError("generator didn't yield") [celery.app.trace] loc=trace.py:267
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.10/site-packages/celery/app/trace.py", line 453, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/celery/app/trace.py", line 736, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/providers/celery/executors/celery_executor_utils.py", line 163, in execute_workload
    supervise(
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 1928, in supervise
    logger, log_file_descriptor = _configure_logging(log_path, client)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/sdk/execution_time/supervisor.py", line 1843, in _configure_logging
    with _remote_logging_conn(client):
  File "/usr/python/lib/python3.10/contextlib.py", line 137, in __enter__
    raise RuntimeError("generator didn't yield") from None
RuntimeError: generator didn't yield
```
However, if I change `remote_log_conn_id: "aws_default"` to `remote_log_conn_id: "aws"`, the error goes away and the logs correctly end up in both the UI and CloudWatch.
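
For context, this is roughly the shape of the remote logging config I'm testing with (a minimal sketch in Helm-values style; the log group ARN, region, and account id are placeholders, not my actual setup):

```yaml
config:
  logging:
    remote_logging: "True"
    # placeholder CloudWatch log group ARN, swap in your own
    remote_base_log_folder: "cloudwatch://arn:aws:logs:eu-west-1:123456789012:log-group:airflow-task-logs"
    # using a connection id other than "aws_default" avoids the error above
    remote_log_conn_id: "aws"
```

The only difference from the failing setup is the connection id; the `aws` connection itself still needs to be defined (e.g. in the metadata DB, a secrets backend, or an `AIRFLOW_CONN_AWS` environment variable).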