Prab-27 commented on PR #60043:
URL: https://github.com/apache/airflow/pull/60043#issuecomment-3760047672
@kacpermuda,
I have updated the import paths:
```
from providers.google.tests.system.google import DEFAULT_GCP_SYSTEM_TEST_PROJECT_ID
from providers.openlineage.tests.system.openlineage.operator import OpenLineageTestOperator
```
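For context, these symbols are consumed in the usual system-test way, roughly like this (a sketch of the common pattern, not the exact contents of `example_bigquery_queries.py`):
```
import os

from providers.google.tests.system.google import DEFAULT_GCP_SYSTEM_TEST_PROJECT_ID

# Common pattern in the Google system-test examples: the SYSTEM_TESTS_* env vars
# drive the module-level constants, with DEFAULT_GCP_SYSTEM_TEST_PROJECT_ID used
# as the fallback project when SYSTEM_TESTS_GCP_PROJECT is not exported.
ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
PROJECT_ID = os.environ.get("SYSTEM_TESTS_GCP_PROJECT") or DEFAULT_GCP_SYSTEM_TEST_PROJECT_ID
```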
Then I ran
`providers/google/tests/system/google/cloud/bigquery/example_bigquery_queries.py`
(with the SYSTEM_TESTS_ENV_ID and SYSTEM_TESTS_GCP_PROJECT environment variables set), and it fails with the output below:
```
devel-common/src/tests_common/test_utils/system_tests.py:89: in test_run
    assert dag_run.state == DagRunState.SUCCESS, (
E   AssertionError: The system test failed, please look at the logs to find out the underlying failed task(s)
-------------------------------------- Captured stdout setup --------------------------------------
========================= AIRFLOW ==========================
Home of the user: /root
Airflow home /root/airflow
Skipping initializing of the DB as it was initialized already.
You can re-initialize the database by adding --with-db-init flag when running tests.
-------------------------------------- Captured stderr setup --------------------------------------
2026-01-16T12:08:51.130173Z [warning ] Skipping masking for a secret as it's too short (<5 chars) [airflow._shared.secrets_masker.secrets_masker]
2026-01-16T12:08:52.306526Z [warning ] Couldn't find any OpenLineage transport configuration; will print events to console. [openlineage.client.client]
2026-01-16T12:08:52.307013Z [info ] OpenLineageClient will use `console` transport [openlineage.client.client]
-------------------------------------- Captured stdout call ---------------------------------------
2026-01-16T12:08:52.470987Z [info ] Created dag run. [airflow.models.dagrun] dagrun=<DagRun bigquery_queries @ 2026-01-16 12:08:52.430140+00:00: manual__2026-01-16T12:08:52.440311+00:00, state:running, queued_at: None. run_type: manual>
2026-01-16T12:08:58.528772Z [info ] Task started [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b5-5efe-76f3-91b8-67972b52bf92 hostname=a3f933c23504 previous_state=queued ti_id=019bc6b5-5a00-7670-852c-0f86d211e8c0
2026-01-16T12:08:58.534227Z [info ] Task instance state updated [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b5-5efe-76f3-91b8-67972b52bf92 rows_affected=1 ti_id=019bc6b5-5a00-7670-852c-0f86d211e8c0
2026-01-16T12:08:58.599289Z [info ] Updating RenderedTaskInstanceFields [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b5-71df-7369-ab44-b9ad2be5c8ff field_count=5 ti_id=019bc6b5-5a00-7670-852c-0f86d211e8c0
2026-01-16T12:09:10.916455Z [info ] Getting connection using `google.auth.default()` since no explicit credentials are provided. [airflow.providers.google.cloud.utils.credentials_provider._CredentialProvider]
2026-01-16T12:09:22.910586Z [error ] Task failed with exception [task]
Traceback (most recent call last):
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 1132, in run
    result = _execute_task(context=context, ti=ti, log=log)
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 1539, in _execute_task
    result = ctx.run(execute, context=context)
  File "/opt/airflow/task-sdk/src/airflow/sdk/bases/operator.py", line 443, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/operators/bigquery.py", line 1551, in execute
    dataset = bq_hook.create_empty_dataset(
  File "/opt/airflow/providers/google/src/airflow/providers/google/common/hooks/base_google.py", line 541, in inner_wrapper
    kwargs["project_id"] = kwargs["project_id"] or self.project_id
  File "/opt/airflow/providers/google/src/airflow/providers/google/common/hooks/base_google.py", line 436, in project_id
    _, project_id = self.get_credentials_and_project_id()
  File "/opt/airflow/providers/google/src/airflow/providers/google/common/hooks/base_google.py", line 329, in get_credentials_and_project_id
    credentials, project_id = get_credentials_and_project_id(
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/utils/credentials_provider.py", line 438, in get_credentials_and_project_id
    return _CredentialProvider(*args, **kwargs).get_credentials_and_project()
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/utils/credentials_provider.py", line 277, in get_credentials_and_project
    credentials, project_id = self._get_credentials_using_adc()
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/utils/credentials_provider.py", line 419, in _get_credentials_using_adc
    credentials, project_id = google.auth.default(scopes=scopes)
  File "/usr/python/lib/python3.10/site-packages/google/auth/_default.py", line 752, in default
    raise exceptions.DefaultCredentialsError(_CLOUD_SDK_MISSING_CREDENTIALS)
google.auth.exceptions.DefaultCredentialsError: Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.
2026-01-16T12:09:22.945313Z [error ] XCom not found [airflow.sdk.api.client] dag_id=bigquery_queries detail={'detail': {'reason': 'not_found', 'message': "XCom with key='bigquery_dataset' map_index=-1 not found for task 'create_dataset' in DAG run 'manual__2026-01-16T12:08:52.440311+00:00' of 'bigquery_queries'"}} key=bigquery_dataset map_index=-1 run_id=manual__2026-01-16T12:08:52.440311+00:00 status_code=404 task_id=create_dataset
2026-01-16T12:09:22.945708Z [warning ] No XCom value found; defaulting to None. [task] dag_id=bigquery_queries key=bigquery_dataset map_index=-1 run_id=manual__2026-01-16T12:08:52.440311+00:00 task_id=create_dataset
2026-01-16T12:09:22.956961Z [error ] XCom not found [airflow.sdk.api.client] dag_id=bigquery_queries detail={'detail': {'reason': 'not_found', 'message': "XCom with key='bigquery_dataset' map_index=-1 not found for task 'create_dataset' in DAG run 'manual__2026-01-16T12:08:52.440311+00:00' of 'bigquery_queries'"}} key=bigquery_dataset map_index=-1 run_id=manual__2026-01-16T12:08:52.440311+00:00 status_code=404 task_id=create_dataset
2026-01-16T12:09:22.957336Z [warning ] No XCom value found; defaulting to None. [task] dag_id=bigquery_queries key=bigquery_dataset map_index=-1 run_id=manual__2026-01-16T12:08:52.440311+00:00 task_id=create_dataset
2026-01-16T12:09:35.236691Z [info ] Task instance state updated [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-00bc-75e5-b4b7-cfc0cc6f51b2 new_state=failed rows_affected=1 ti_id=019bc6b5-5a00-7670-852c-0f86d211e8c0
2026-01-16T12:09:39.602598Z [info ] Task started [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-021f-7075-ac14-03513a8d2ac6 hostname=a3f933c23504 previous_state=queued ti_id=019bc6b5-5a11-7170-b386-61b01b3fab5c
2026-01-16T12:09:39.605684Z [info ] Task instance state updated [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-021f-7075-ac14-03513a8d2ac6 rows_affected=1 ti_id=019bc6b5-5a11-7170-b386-61b01b3fab5c
2026-01-16T12:09:39.674574Z [info ] Updating RenderedTaskInstanceFields [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-1254-77b4-b563-088df6c92991 field_count=3 ti_id=019bc6b5-5a11-7170-b386-61b01b3fab5c
2026-01-16T12:09:52.816624Z [error ] Task failed with exception [task]
Traceback (most recent call last):
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 1132, in run
    result = _execute_task(context=context, ti=ti, log=log)
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 1539, in _execute_task
    result = ctx.run(execute, context=context)
  File "/opt/airflow/task-sdk/src/airflow/sdk/bases/operator.py", line 443, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/task-sdk/src/airflow/sdk/bases/decorator.py", line 299, in execute
    return_value = super().execute(context)
  File "/opt/airflow/task-sdk/src/airflow/sdk/bases/operator.py", line 443, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/providers/standard/src/airflow/providers/standard/operators/python.py", line 228, in execute
    return_value = self.execute_callable()
  File "/opt/airflow/providers/standard/src/airflow/providers/standard/operators/python.py", line 251, in execute_callable
    return runner.run(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/callback_runner.py", line 97, in run
    return func(*args, **kwargs)
  File "/opt/airflow/devel-common/src/tests_common/test_utils/watcher.py", line 41, in watcher
    raise AirflowException("Failing task because one or more upstream tasks failed.")
airflow.sdk.exceptions.AirflowException: Failing task because one or more upstream tasks failed.
2026-01-16T12:10:05.351570Z [info ] Task instance state updated [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-7687-7985-b7c3-610a65a6bde8 new_state=failed rows_affected=1 ti_id=019bc6b5-5a11-7170-b386-61b01b3fab5c
2026-01-16T12:10:13.558009Z [info ] Task started [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-8884-7a7e-bca8-f9f9a5921ad5 hostname=a3f933c23504 previous_state=queued ti_id=019bc6b5-5a0f-72a5-a6c6-8c308dbd64cf
2026-01-16T12:10:13.560476Z [info ] Task instance state updated [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-8884-7a7e-bca8-f9f9a5921ad5 rows_affected=1 ti_id=019bc6b5-5a0f-72a5-a6c6-8c308dbd64cf
2026-01-16T12:10:13.594896Z [info ] Updating RenderedTaskInstanceFields [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b6-96d1-7b91-9803-29fcb9d99b96 field_count=4 ti_id=019bc6b5-5a0f-72a5-a6c6-8c308dbd64cf
2026-01-16T12:10:26.642044Z [info ] Dataset id: dataset_bigquery_queries_9i Project id: None [airflow.task.operators.airflow.providers.google.cloud.operators.bigquery.BigQueryDeleteDatasetOperator]
2026-01-16T12:10:26.646131Z [info ] Getting connection using `google.auth.default()` since no explicit credentials are provided. [airflow.providers.google.cloud.utils.credentials_provider._CredentialProvider]
2026-01-16T12:10:37.868724Z [error ] Task failed with exception [task]
Traceback (most recent call last):
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 1132, in run
    result = _execute_task(context=context, ti=ti, log=log)
  File "/opt/airflow/task-sdk/src/airflow/sdk/execution_time/task_runner.py", line 1539, in _execute_task
    result = ctx.run(execute, context=context)
  File "/opt/airflow/task-sdk/src/airflow/sdk/bases/operator.py", line 443, in wrapper
    return func(self, *args, **kwargs)
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/operators/bigquery.py", line 1456, in execute
    bq_hook.delete_dataset(
  File "/opt/airflow/providers/google/src/airflow/providers/google/common/hooks/base_google.py", line 541, in inner_wrapper
    kwargs["project_id"] = kwargs["project_id"] or self.project_id
  File "/opt/airflow/providers/google/src/airflow/providers/google/common/hooks/base_google.py", line 436, in project_id
    _, project_id = self.get_credentials_and_project_id()
  File "/opt/airflow/providers/google/src/airflow/providers/google/common/hooks/base_google.py", line 329, in get_credentials_and_project_id
    credentials, project_id = get_credentials_and_project_id(
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/utils/credentials_provider.py", line 438, in get_credentials_and_project_id
    return _CredentialProvider(*args, **kwargs).get_credentials_and_project()
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/utils/credentials_provider.py", line 277, in get_credentials_and_project
    credentials, project_id = self._get_credentials_using_adc()
  File "/opt/airflow/providers/google/src/airflow/providers/google/cloud/utils/credentials_provider.py", line 419, in _get_credentials_using_adc
    credentials, project_id = google.auth.default(scopes=scopes)
  File "/usr/python/lib/python3.10/site-packages/google/auth/_default.py", line 752, in default
    raise exceptions.DefaultCredentialsError(_CLOUD_SDK_MISSING_CREDENTIALS)
google.auth.exceptions.DefaultCredentialsError: Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.
2026-01-16T12:10:51.018027Z [info ] Task instance state updated [airflow.api_fastapi.execution_api.routes.task_instances] correlation_id=019bc6b7-28c1-7a85-b749-ec258f7a27ed new_state=failed rows_affected=1 ti_id=019bc6b5-5a0f-72a5-a6c6-8c308dbd64cf
....
....
```
I believe this is an unrelated error (the test environment is missing Google Application Default Credentials), the paths themselves are updated correctly, and I still need to apply the same path update across all the remaining files.
Could you please share your thoughts on this?
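For reference, here is a minimal check (my own sketch, not part of this PR) that reproduces the credential lookup from the traceback; if it raises `DefaultCredentialsError`, the failure is just missing ADC in my environment rather than anything related to the path change:
```
import google.auth
from google.auth.exceptions import DefaultCredentialsError

try:
    # Same lookup the Google provider hook performs when no explicit
    # connection credentials are configured.
    credentials, project_id = google.auth.default()
    print(f"ADC found, default project: {project_id}")
except DefaultCredentialsError as exc:
    print(f"ADC not configured: {exc}")
```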