vincbeck commented on PR #40205: URL: https://github.com/apache/airflow/pull/40205#issuecomment-2168424139
The culprit is ... https://github.com/apache/airflow/blob/main/airflow/cli/commands/task_command.py#L470. If I comment it out, it still fails for the same reason, just one task further along: the first task succeeds, and then the second one fails identically.

```
[2024-06-14T16:52:57.668+0000] {dag.py:4276} INFO - created dagrun <DagRun example_sns @ 2024-06-14 16:52:35.429844+00:00: manual__2024-06-14T16:52:35.429844+00:00, state:running, queued_at: None. externally triggered: False>
[2024-06-14T16:52:57.668+0000] {executor_loader.py:252} INFO - Loaded executor: LocalExecutor
[2024-06-14T16:52:57.760+0000] {base_executor.py:149} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_sns', 'variable_fetcher', 'manual__2024-06-14T16:52:35.429844+00:00', '--force', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/tests/system/providers/amazon/aws/example_sns.py']
[2024-06-14T16:52:57.762+0000] {local_executor.py:90} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'example_sns', 'variable_fetcher', 'manual__2024-06-14T16:52:35.429844+00:00', '--force', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/tests/system/providers/amazon/aws/example_sns.py']
[2024-06-14T16:52:57.861+0000] {dagbag.py:574} INFO - Filling up the DagBag from /opt/***/tests/system/providers/amazon/aws/example_sns.py
[2024-06-14T16:52:57.940+0000] {base_aws.py:164} INFO - No connection ID provided. Fallback on boto3 credential strategy (region_name='None').
See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
[2024-06-14T16:52:57.982+0000] {credentials.py:1278} INFO - Found credentials in shared credentials file: ~/.aws/credentials
[2024-06-14T16:52:58.179+0000] {__init__.py:116} INFO - SSM contains one parameter for this test, but not the requested value: 'SYSTEM_TESTS_ENV_ID'
[2024-06-14T16:52:58.550+0000] {task_command.py:462} INFO - Running <TaskInstance: example_sns.variable_fetcher manual__2024-06-14T16:52:35.429844+00:00 [scheduled]> on host 013e9415b651
[2024-06-14T16:52:58.581+0000] {dag.py:2979} WARNING - No tasks to run. unrunnable tasks: {<TaskInstance: example_sns.watcher manual__2024-06-14T16:52:35.429844+00:00 [None]>, <TaskInstance: example_sns.variable_fetcher manual__2024-06-14T16:52:35.429844+00:00 [running]>, <TaskInstance: example_sns.publish_message manual__2024-06-14T16:52:35.429844+00:00 [None]>, <TaskInstance: example_sns.create_topic manual__2024-06-14T16:52:35.429844+00:00 [None]>, <TaskInstance: example_sns.delete_topic manual__2024-06-14T16:52:35.429844+00:00 [None]>}
[2024-06-14T16:52:59.618+0000] {base_executor.py:149} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_sns', 'create_topic', 'manual__2024-06-14T16:52:35.429844+00:00', '--force', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/tests/system/providers/amazon/aws/example_sns.py']
[2024-06-14T16:52:59.619+0000] {local_executor.py:90} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'example_sns', 'create_topic', 'manual__2024-06-14T16:52:35.429844+00:00', '--force', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/tests/system/providers/amazon/aws/example_sns.py']
[2024-06-14T16:52:59.737+0000] {dagbag.py:574} INFO - Filling up the DagBag from /opt/***/tests/system/providers/amazon/aws/example_sns.py
[2024-06-14T16:52:59.822+0000] {base_aws.py:164} INFO - No connection ID provided.
Fallback on boto3 credential strategy (region_name='None'). See: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
[2024-06-14T16:52:59.865+0000] {credentials.py:1278} INFO - Found credentials in shared credentials file: ~/.aws/credentials
[2024-06-14T16:53:00.071+0000] {__init__.py:116} INFO - SSM contains one parameter for this test, but not the requested value: 'SYSTEM_TESTS_ENV_ID'
[2024-06-14T16:53:00.451+0000] {task_command.py:462} INFO - Running <TaskInstance: example_sns.create_topic manual__2024-06-14T16:52:35.429844+00:00 [scheduled]> on host 013e9415b651
[2024-06-14T16:53:00.481+0000] {dag.py:2979} WARNING - No tasks to run. unrunnable tasks: {<TaskInstance: example_sns.watcher manual__2024-06-14T16:52:35.429844+00:00 [None]>, <TaskInstance: example_sns.create_topic manual__2024-06-14T16:52:35.429844+00:00 [running]>, <TaskInstance: example_sns.delete_topic manual__2024-06-14T16:52:35.429844+00:00 [None]>, <TaskInstance: example_sns.publish_message manual__2024-06-14T16:52:35.429844+00:00 [None]>}
[2024-06-14T16:53:01.515+0000] {base_executor.py:149} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_sns', 'publish_message', 'manual__2024-06-14T16:52:35.429844+00:00', '--force', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/tests/system/providers/amazon/aws/example_sns.py']
[2024-06-14T16:53:01.516+0000] {local_executor.py:90} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'example_sns', 'publish_message', 'manual__2024-06-14T16:52:35.429844+00:00', '--force', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/tests/system/providers/amazon/aws/example_sns.py']
[2024-06-14T16:53:01.547+0000] {cli_action_loggers.py:176} WARNING - Failed to log action (psycopg2.DatabaseError) error with status PGRES_TUPLES_OK and no message from the libpq (Background on this error at: https://sqlalche.me/e/14/4xp6)
Traceback (most recent call last):
  File
"/usr/local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/opt/airflow/airflow/__main__.py", line 58, in main
    args.func(args)
  File "/opt/airflow/airflow/cli/cli_config.py", line 49, in command
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/utils/cli.py", line 115, in wrapper
    return f(*args, **kwargs)
  File "/opt/airflow/airflow/utils/providers_configuration_loader.py", line 55, in wrapped_function
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/utils/session.py", line 84, in wrapper
    return func(*args, session=session, **kwargs)
  File "/opt/airflow/airflow/cli/commands/dag_command.py", line 611, in dag_test
    dr: DagRun = dag.test(
  File "/opt/airflow/airflow/utils/session.py", line 81, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/models/dag.py", line 2968, in test
    schedulable_tis, _ = dr.update_state(session=session)
  File "/opt/airflow/airflow/utils/session.py", line 81, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/models/dagrun.py", line 796, in update_state
    info = self.task_instance_scheduling_decisions(session)
  File "/opt/airflow/airflow/utils/session.py", line 81, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/models/dagrun.py", line 931, in task_instance_scheduling_decisions
    tis = self.get_task_instances(session=session, state=State.task_states)
  File "/opt/airflow/airflow/utils/session.py", line 81, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/models/dagrun.py", line 625, in get_task_instances
    return DagRun.fetch_task_instances(
  File "/opt/airflow/airflow/api_internal/internal_api_call.py", line 127, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/utils/session.py", line 81, in wrapper
    return func(*args, **kwargs)
  File "/opt/airflow/airflow/models/dagrun.py", line 562, in fetch_task_instances
    return session.scalars(tis).all()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 1778, in scalars
    return
self.execute(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 1720, in execute
    result = compile_state_cls.orm_setup_cursor_result(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/context.py", line 349, in orm_setup_cursor_result
    return loading.instances(result, querycontext)
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/loading.py", line 88, in instances
    with util.safe_reraise():
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
    compat.raise_(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/loading.py", line 69, in instances
    *[
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/loading.py", line 70, in <listcomp>
    query_entity.row_processor(context, cursor)
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/context.py", line 2631, in row_processor
    _instance = loading._instance_processor(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/loading.py", line 796, in _instance_processor
    prop.create_row_processor(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/interfaces.py", line 658, in create_row_processor
    strat.create_row_processor(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/strategies.py", line 2533, in create_row_processor
    eager_adapter = self._create_eager_adapter(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/strategies.py", line 2504, in _create_eager_adapter
    if self.mapper._result_has_identity_key(result, decorator):
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/mapper.py", line 2887, in _result_has_identity_key
    rk = result.keys()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/result.py", line 708, in keys
    return self._metadata.keys
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/cursor.py", line 1225, in keys
    self._we_dont_return_rows()
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/cursor.py", line 1202, in _we_dont_return_rows
    util.raise_(
  File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/compat.py", line 211, in raise_
    raise exception
sqlalchemy.exc.ResourceClosedError: This result object does not return rows. It has been closed automatically.
```
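For context on the final exception: SQLAlchemy raises `ResourceClosedError` with exactly this message whenever row-fetching methods are called on a result that carries no rows, which is why the `session.scalars(tis).all()` in `fetch_task_instances` blows up once the underlying libpq connection returns a row-less result. A minimal sketch of the error class itself, using an in-memory SQLite engine purely for illustration (the real failure happens against Postgres inside `dag.test()`):

```python
# Hypothetical minimal reproduction of SQLAlchemy's ResourceClosedError,
# unrelated to Airflow itself: an INSERT produces a CursorResult that
# returns no rows, so fetching from it raises the same exception that
# ends the traceback above.
from sqlalchemy import create_engine, text
from sqlalchemy.exc import ResourceClosedError

engine = create_engine("sqlite://")
with engine.connect() as conn:
    conn.execute(text("CREATE TABLE t (x INTEGER)"))
    result = conn.execute(text("INSERT INTO t (x) VALUES (1)"))
    try:
        result.fetchall()  # no rows to return
    except ResourceClosedError as exc:
        print(exc)  # "This result object does not return rows. ..."
```

So the `PGRES_TUPLES_OK ... no message from the libpq` warning right before it suggests the connection state got corrupted (e.g. shared across the forked LocalExecutor worker), leaving the ORM with a result it treats as row-less.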
