[ 
https://issues.apache.org/jira/browse/AIRFLOW-5249?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Thomas Pilewicz reassigned AIRFLOW-5249:
----------------------------------------

    Assignee: Thomas Pilewicz

> BigQueryCheckOperator fails for datasets outside of 'US' region
> ---------------------------------------------------------------
>
>                 Key: AIRFLOW-5249
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5249
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: operators
>    Affects Versions: 1.10.2
>            Reporter: Michael
>            Assignee: Thomas Pilewicz
>            Priority: Blocker
>
> When I try to use the BigQueryCheckOperator or BigQueryValueCheckOperator on 
> a dataset that is not in the 'US' location, my task fails with the following 
> error:
> {code}
> [2019-08-15 07:26:19,378] {__init__.py:1580} ERROR - BigQuery job status check failed. Final error was: 404
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1241, in run_with_configuration
>     jobId=self.running_job_id).execute()
>   File "/usr/local/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
>     return wrapped(*args, **kwargs)
>   File "/usr/local/lib/python3.6/site-packages/googleapiclient/http.py", line 855, in execute
>     raise HttpError(resp, content, uri=self.uri)
> googleapiclient.errors.HttpError: <HttpError 404 when requesting https://www.googleapis.com/bigquery/v2/projects/anz-data-cde-airflow/jobs/job_ISDpiVtd7U1p-6N9wT378LfwoFHc?alt=json returned "Not found: Job anz-data-cde-airflow:job_ISDpiVtd7U1p-6N9wT378LfwoFHc">
> 
> During handling of the above exception, another exception occurred:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.6/site-packages/airflow/models/__init__.py", line 1441, in _run_raw_task
>     result = task_copy.execute(context=context)
>   File "/usr/local/lib/python3.6/site-packages/airflow/operators/check_operator.py", line 81, in execute
>     records = self.get_db_hook().get_first(self.sql)
>   File "/usr/local/lib/python3.6/site-packages/airflow/hooks/dbapi_hook.py", line 138, in get_first
>     cur.execute(sql)
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1821, in execute
>     self.job_id = self.run_query(sql)
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 849, in run_query
>     return self.run_with_configuration(configuration)
>   File "/usr/local/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1263, in run_with_configuration
>     format(err.resp.status))
> Exception: BigQuery job status check failed. Final error was: 404
> [2019-08-15 07:26:19,388] {__init__.py:1611} INFO - Marking task as FAILED.
> {code}
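> For reference, the traceback above comes from a task defined roughly like the 
> following (the project, dataset, table, and connection names here are 
> placeholders, not the real ones):
> {code:python}
> from datetime import datetime
> 
> from airflow import DAG
> from airflow.contrib.operators.bigquery_check_operator import BigQueryCheckOperator
> 
> dag = DAG('bq_check_example', start_date=datetime(2019, 8, 1), schedule_interval=None)
> 
> # The referenced dataset lives in a non-US location (e.g. an EU region).
> check = BigQueryCheckOperator(
>     task_id='check_table_not_empty',
>     sql='SELECT COUNT(*) FROM `my-project.my_eu_dataset.my_table`',
>     use_legacy_sql=False,
>     bigquery_conn_id='bigquery_default',
>     dag=dag,
> )
> {code}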
> This is the same error I get when I try to run the BigQuery operator without 
> specifying a location. When I run the same operator on a dataset that is in 
> the US region, it succeeds.
> The BigQueryCheckOperator does not accept a location as one of its arguments 
> and does not pass a location on to the BigQueryHook; I believe this is the 
> source of the problem.
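> As a rough, untested sketch of the kind of fix or workaround I have in mind 
> (it assumes the installed BigQueryHook accepts a location keyword argument; 
> if the hook in use does not, this will not work as-is), the check operator 
> could simply forward a location to the hook it builds:
> {code:python}
> from airflow.contrib.hooks.bigquery_hook import BigQueryHook
> from airflow.contrib.operators.bigquery_check_operator import BigQueryCheckOperator
> 
> 
> class BigQueryCheckOperatorWithLocation(BigQueryCheckOperator):
>     """Hypothetical variant that threads a dataset location through to the hook."""
> 
>     def __init__(self, sql, location=None, *args, **kwargs):
>         super(BigQueryCheckOperatorWithLocation, self).__init__(sql, *args, **kwargs)
>         self.location = location
> 
>     def get_db_hook(self):
>         # Forward the location so the job status check polls the right region
>         # instead of defaulting to 'US'.
>         return BigQueryHook(
>             bigquery_conn_id=self.bigquery_conn_id,
>             use_legacy_sql=self.use_legacy_sql,
>             location=self.location)
> {code}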
>  
> I realise a task (AIRFLOW-3601) was already created to fix a similar issue, 
> but that task calls out the two operators I'm having trouble with as out of 
> scope, and after commenting on it I have not received a response.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)