[
https://issues.apache.org/jira/browse/AIRFLOW-3965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16778836#comment-16778836
]
ASF GitHub Bot commented on AIRFLOW-3965:
-----------------------------------------
lihan commented on pull request #4785: [AIRFLOW-3965] Fixing
GoogleCloudStorageToBigQueryOperator failing for jobs outside US and EU
URL: https://github.com/apache/airflow/pull/4785
Make sure you have checked _all_ steps below.
### Jira
Reported on https://issues.apache.org/jira/browse/AIRFLOW-3965
### Description
The GoogleCloudStorageToBigQueryOperator class does not accept a "location"
parameter, so jobs in locations other than "US" and "EU" fail.
To be more precise, the BigQuery job itself still succeeds, but the task
instance is marked as failed because the operator cannot fetch the job status
(HTTP 404). Setting the location parameter fixes this problem.
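The failure mode can be illustrated with a small sketch of the jobs.get REST call the hook issues when polling job status. `job_status_url` below is a hypothetical helper, not part of Airflow; the point it demonstrates is that BigQuery's jobs.get endpoint needs a `location` query parameter for jobs outside the US and EU multi-regions.

```python
from urllib.parse import urlencode

def job_status_url(project_id, job_id, location=None):
    """Build the BigQuery jobs.get URL used to poll a job's status.

    Without `location`, BigQuery only looks in the US and EU
    multi-regions, so a status check for e.g. an
    australia-southeast1 job returns HTTP 404 even though the
    job itself completed successfully.
    """
    base = ("https://www.googleapis.com/bigquery/v2"
            f"/projects/{project_id}/jobs/{job_id}")
    params = {"alt": "json"}
    if location:
        params["location"] = location
    return f"{base}?{urlencode(params)}"

# Status check that 404s for a non-US/EU job:
print(job_status_url("my-project", "job_123"))
# Status check that succeeds once the operator forwards `location`:
print(job_status_url("my-project", "job_123",
                     location="australia-southeast1"))
```

The fix in this PR takes the same approach at the operator level: accept `location` and pass it through to the BigQuery hook so the status poll targets the right region.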
### Tests
- [x] My PR adds the following unit tests __OR__ does not need testing for
this extremely good reason:
Does not need testing, as the same parameter is already used by many other
GCP connection classes.
### Commits
- [x] My commits all reference Jira issues in their subject lines, and I
have squashed multiple commits if they address the same issue. In addition, my
commits follow the guidelines from "[How to write a good git commit
message](http://chris.beams.io/posts/git-commit/)":
1. Subject is separated from body by a blank line
1. Subject is limited to 50 characters (not including Jira issue reference)
1. Subject does not end with a period
1. Subject uses the imperative mood ("add", not "adding")
1. Body wraps at 72 characters
1. Body explains "what" and "why", not "how"
### Documentation
- [x] In case of new functionality, my PR adds documentation that describes
how to use it.
- When adding new operators/hooks/sensors, the autoclass documentation
generation needs to be added.
    - All the public functions and the classes in the PR contain docstrings
that explain what they do
### Code Quality
- [x] Passes `flake8`
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
> GoogleCloudStorageToBigQueryOperator has no "location" parameter
> ----------------------------------------------------------------
>
> Key: AIRFLOW-3965
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3965
> Project: Apache Airflow
> Issue Type: Bug
> Components: operators
> Affects Versions: 1.10.2
> Reporter: Lihan Li
> Assignee: Lihan Li
> Priority: Critical
> Original Estimate: 24h
> Remaining Estimate: 24h
>
> The *GoogleCloudStorageToBigQueryOperator* class does not accept a "location"
> parameter, so jobs in locations other than "US" and "EU" fail.
> To be more precise, the BigQuery job itself still succeeds, but the task
> instance is marked as failed because the operator cannot fetch the job status
> (HTTP 404). Setting the location parameter fixes this problem.
>
> See error traceback
>
> {code:python}
> Traceback (most recent call last):
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1124, in run_with_configuration
>     try:
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
>     return wrapped(*args, **kwargs)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/googleapiclient/http.py", line 851, in execute
>     raise HttpError(resp, content, uri=self.uri)
> googleapiclient.errors.HttpError: <HttpError 404 when requesting https://www.googleapis.com/bigquery/v2/projects/<my-project-id>/jobs/job_sf_CVX8Pa49m7u6YSaFetj62qxmM?alt=json returned "Not found: Job <my-project-id>:job_sf_CVX8Pa49m7u6YSaFetj62qxmM">
>
> During handling of the above exception, another exception occurred:
>
> Traceback (most recent call last):
>   File "/Users/lihanli/.virtualenvs/nbw/bin/airflow", line 32, in <module>
>     args.func(args)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, in wrapper
>     return f(*args, **kwargs)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/bin/cli.py", line 651, in test
>     ti.run(ignore_task_deps=True, ignore_ti_state=True, test_mode=True)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/utils/db.py", line 73, in wrapper
>     return func(*args, **kwargs)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/models.py", line 1750, in run
>     session=session)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/utils/db.py", line 69, in wrapper
>     return func(*args, **kwargs)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/models.py", line 1657, in _run_raw_task
>     result = task_copy.execute(context=context)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/operators/gcs_to_bq.py", line 257, in execute
>     ignore_unknown_values=self.ignore_unknown_values,
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1096, in run_load
>     return self.run_with_configuration(configuration)
>   File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1124, in run_with_configuration
>     try:
> Exception: ('BigQuery job status check failed. Final error was: %s', 404)
> {code}
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)