amoghrajesh commented on PR #32487:
URL: https://github.com/apache/airflow/pull/32487#issuecomment-1643493375
@VladaZakharova I am unable to test this because I do not have the
required connection (`google_cloud_default`) configured. My runs always fail
with this error:
```
*** Found local files:
*** *
/root/airflow/logs/dag_id=example_gcp_dataflow_sql/run_id=manual__2023-07-20T08:19:58+00:00/task_id=create_dataset_with_location/attempt=1.log
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1144} INFO - Dependencies all
met for dep_context=non-requeueable deps ti=<TaskInstance:
example_gcp_dataflow_sql.create_dataset_with_location
manual__2023-07-20T08:19:58+00:00 [queued]>
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1144} INFO - Dependencies all
met for dep_context=requeueable deps ti=<TaskInstance:
example_gcp_dataflow_sql.create_dataset_with_location
manual__2023-07-20T08:19:58+00:00 [queued]>
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1347} INFO - Starting attempt 1
of 1
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1368} INFO - Executing
<Task(BigQueryCreateEmptyDatasetOperator): create_dataset_with_location> on
2023-07-20 08:19:58+00:00
[2023-07-20, 08:20:02 UTC] {standard_task_runner.py:57} INFO - Started
process 5801 to run task
[2023-07-20, 08:20:02 UTC] {standard_task_runner.py:84} INFO - Running:
['airflow', 'tasks', 'run', 'example_gcp_dataflow_sql',
'create_dataset_with_location', 'manual__2023-07-20T08:19:58+00:00',
'--job-id', '37', '--raw', '--subdir', 'DAGS_FOLDER/bq.py', '--cfg-path',
'/tmp/tmpberli5h9']
[2023-07-20, 08:20:02 UTC] {standard_task_runner.py:85} INFO - Job 37:
Subtask create_dataset_with_location
[2023-07-20, 08:20:02 UTC] {task_command.py:410} INFO - Running
<TaskInstance: example_gcp_dataflow_sql.create_dataset_with_location
manual__2023-07-20T08:19:58+00:00 [running]> on host 39f9d20a4c8b
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1646} INFO - Exporting env vars:
AIRFLOW_CTX_DAG_OWNER='airflow' AIRFLOW_CTX_DAG_ID='example_gcp_dataflow_sql'
AIRFLOW_CTX_TASK_ID='create_dataset_with_location'
AIRFLOW_CTX_EXECUTION_DATE='2023-07-20T08:19:58+00:00'
AIRFLOW_CTX_TRY_NUMBER='1'
AIRFLOW_CTX_DAG_RUN_ID='manual__2023-07-20T08:19:58+00:00'
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1925} ERROR - Task failed with
exception
Traceback (most recent call last):
File "/opt/airflow/airflow/providers/google/cloud/operators/bigquery.py",
line 1898, in execute
bq_hook = BigQueryHook(
File "/opt/airflow/airflow/providers/google/cloud/hooks/bigquery.py", line
116, in __init__
super().__init__(
File "/opt/airflow/airflow/providers/google/common/hooks/base_google.py",
line 237, in __init__
self.extras: dict = self.get_connection(self.gcp_conn_id).extra_dejson
File "/opt/airflow/airflow/hooks/base.py", line 72, in get_connection
conn = Connection.get_connection_from_secrets(conn_id)
File "/opt/airflow/airflow/models/connection.py", line 463, in
get_connection_from_secrets
raise AirflowNotFoundException(f"The conn_id `{conn_id}` isn't defined")
airflow.exceptions.AirflowNotFoundException: The conn_id
`google_cloud_default` isn't defined
[2023-07-20, 08:20:02 UTC] {taskinstance.py:1386} INFO - Marking task as
FAILED. dag_id=example_gcp_dataflow_sql, task_id=create_dataset_with_location,
execution_date=20230720T081958, start_date=20230720T082002,
end_date=20230720T082002
[2023-07-20, 08:20:02 UTC] {standard_task_runner.py:104} ERROR - Failed to
execute job 37 for task create_dataset_with_location (The conn_id
`google_cloud_default` isn't defined; 5801)
[2023-07-20, 08:20:02 UTC] {local_task_job_runner.py:225} INFO - Task exited
with return code 1
[2023-07-20, 08:20:02 UTC] {taskinstance.py:2761} INFO - 0 downstream tasks
scheduled from follow-on schedule check
```
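For anyone hitting the same `AirflowNotFoundException`: Airflow resolves a conn_id from an environment variable named `AIRFLOW_CONN_<CONN_ID in uppercase>` holding a connection URI, so one way to unblock the run is to export the GCP connection before starting the worker. A minimal sketch — the key path and project name below are placeholders, not values from this PR:

```python
import os

# Airflow's environment-variables secrets backend looks up
# AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT for conn_id "google_cloud_default".
# The URI scheme is the conn type with underscores replaced by dashes,
# and the key path is percent-encoded ("/files/gcp-key.json" here is
# a placeholder for a real service-account key file).
os.environ["AIRFLOW_CONN_GOOGLE_CLOUD_DEFAULT"] = (
    "google-cloud-platform://?key_path=%2Ffiles%2Fgcp-key.json"
    "&project=my-gcp-project"
)
```

Equivalently, the connection can be registered once with `airflow connections add google_cloud_default --conn-type google_cloud_platform ...` instead of exporting it per process.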
What is the issue you are running into?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]