rasalt opened a new issue #12110:
URL: https://github.com/apache/airflow/issues/12110


   Goal:
   I want to allow jagged rows and specify a JSON file with the schema definition for the BigQuery destination table.
   This works as expected if I use the `schema_fields` argument, but does not when I use the `gcs_schema_object` argument.
   Please advise.
   <!--
   loadGcsToBq = GoogleCloudStorageToBigQueryOperator(
           task_id='gcstobq_jagged',
           bucket='dataengpa',
           source_objects=['deniro.csv'],
           destination_project_dataset_table=f"{DATASET_NAME}.deniro_tmp_jagged",
           skip_leading_rows=1,
           gcs_schema_object='<gcs path>/deniro_schema_jagged.json',
           source_format='CSV',
           create_disposition='CREATE_IF_NEEDED',
           write_disposition='WRITE_TRUNCATE',
           allow_jagged_rows=True,
           )
   
   -->
   
   <!--
   
   schema_fields=[
     {
       "mode": "NULLABLE",
       "name": "year",
       "type": "INTEGER"
     },
     {
       "mode": "NULLABLE",
       "name": "score",
       "type": "INTEGER"
     },
     {
       "mode": "NULLABLE",
       "name": "title",
       "type": "STRING"
     },
     {
       "mode": "NULLABLE",
       "name": "tbd1",
       "type": "STRING"
     }, 
     {
       "mode": "NULLABLE",
       "name": "tbd2",
       "type": "STRING"
     } 
   ]
   
   -->
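   Note: the `PendingDeprecationWarning` in the DAG log below shows `gcs_schema_object` being passed through to the base operator as an *invalid* kwarg (and therefore silently ignored); in `airflow.contrib.operators.gcs_to_bq` the parameter appears to be named `schema_object`, not `gcs_schema_object`. As a hedged workaround sketch, the same schema JSON could be read locally and handed to the `schema_fields` argument, which the report above says works as expected. The helper name and local path here are hypothetical:

   ```python
   import json

   def load_schema_fields(path):
       """Load a BigQuery schema definition from a local JSON file.

       Hypothetical helper: the same schema file that was uploaded to GCS
       could be bundled alongside the DAG and passed via the working
       ``schema_fields`` argument instead of ``gcs_schema_object``.
       """
       with open(path) as f:
           schema_fields = json.load(f)
       # Sanity-check that each field entry has the keys BigQuery expects.
       for field in schema_fields:
           assert {"name", "type"} <= set(field), f"malformed field: {field}"
       return schema_fields
   ```

   The result would then be passed as `schema_fields=load_schema_fields('deniro_schema_jagged.json')` in the operator call above.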
   <!-- DAG log
   
--------------------------------------------------------------------------------
   [2020-11-05 14:44:47,151] {taskinstance.py:901} INFO - Executing 
<Task(GoogleCloudStorageToBigQueryOperator): gcstobq_jagged> on 
2020-05-12T00:00:00+00:00
   [2020-11-05 14:44:47,152] {base_task_runner.py:131} INFO - Running on host: 
airflow-worker-5f878fcd8d-s8nb8
   [2020-11-05 14:44:47,153] {base_task_runner.py:132} INFO - Running: 
['airflow', 'run', 'load_tmp_data_jagged', 'gcstobq_jagged', 
'2020-05-12T00:00:00+00:00', '--job_id', '7389', '--pool', 'default_pool', 
'--raw', '-sd', 'DAGS_FOLDER/bqdag_gcstobqjaggedtemplate.py', '--cfg_path', 
'/tmp/tmp5y4u80pm']
   [2020-11-05 14:44:56,144] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:56,143] {configuration.py:618} INFO - Reading 
the config from /etc/airflow/airflow.cfg
   [2020-11-05 14:44:56,485] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:56,484] {default_celery.py:90} WARNING - You 
have configured a result_backend of 
redis://airflow-redis-service.default.svc.cluster.local:6379/0, it is highly 
recommended to use an alternative result_backend (i.e. a database).
   [2020-11-05 14:44:56,487] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:56,487] {__init__.py:51} INFO - Using executor 
CeleryExecutor
   [2020-11-05 14:44:56,487] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:56,487] {dagbag.py:397} INFO - Filling up the 
DagBag from /home/airflow/gcs/dags/bqdag_gcstobqjaggedtemplate.py
   [2020-11-05 14:44:56,511] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged /usr/local/lib/airflow/airflow/utils/helpers.py:441: 
DeprecationWarning: Importing 'SimpleHttpOperator' directly from 
'airflow.operators' has been deprecated. Please import from 
'airflow.operators.[operator_module]' instead. Support for direct imports will 
be dropped entirely in Airflow 2.0.
   [2020-11-05 14:44:56,511] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged   DeprecationWarning)
   [2020-11-05 14:44:56,868] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged 
/usr/local/lib/airflow/airflow/contrib/operators/gcs_to_bq.py:183: 
PendingDeprecationWarning: Invalid arguments were passed to 
GoogleCloudStorageToBigQueryOperator (task_id: gcstobq_jagged). Support for 
passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
   [2020-11-05 14:44:56,869] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged *args: ()
   [2020-11-05 14:44:56,869] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged **kwargs: {'gcs_schema_object': 
'gs://dfhjshfjs/templates/deniro_schema_jagged.json'}
   [2020-11-05 14:44:56,869] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged   super(GoogleCloudStorageToBigQueryOperator, 
self).__init__(*args, **kwargs)
   [2020-11-05 14:44:57,362] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged Running <TaskInstance: load_tmp_data_jagged.gcstobq_jagged 
2020-05-12T00:00:00+00:00 [running]> on host airflow-worker-5f878fcd8d-s8nb8
   [2020-11-05 14:44:57,466] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:57,466] {gcp_api_base_hook.py:149} INFO - 
Getting connection using `google.auth.default()` since no key file is defined 
for hook.
   [2020-11-05 14:44:57,481] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:57,481] {discovery.py:280} INFO - URL being 
requested: GET https://www.googleapis.com/discovery/v1/apis/bigquery/v2/rest
   [2020-11-05 14:44:57,569] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:57,569] {bigquery_hook.py:2240} INFO - Project 
not included in destination_project_dataset_table: pa.deniro_tmp_jagged; using 
project "dataeng-219316"
   [2020-11-05 14:44:57,577] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:57,577] {discovery.py:911} INFO - URL being 
requested: POST 
https://bigquery.googleapis.com/bigquery/v2/projects/dataeng-219316/jobs?alt=json
   [2020-11-05 14:44:57,870] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:57,870] {discovery.py:911} INFO - URL being 
requested: GET 
https://bigquery.googleapis.com/bigquery/v2/projects/dataeng-219316/jobs/job_vmvnZH7aI_0g2L__fsfNu3EuOaXJ?location=US&alt=json
   [2020-11-05 14:44:57,908] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:44:57,908] {bigquery_hook.py:1347} INFO - Waiting 
for job to complete : dataeng-219316, job_vmvnZH7aI_0g2L__fsfNu3EuOaXJ
   [2020-11-05 14:45:02,914] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:45:02,913] {discovery.py:911} INFO - URL being 
requested: GET 
https://bigquery.googleapis.com/bigquery/v2/projects/dataeng-219316/jobs/job_vmvnZH7aI_0g2L__fsfNu3EuOaXJ?location=US&alt=json
   [2020-11-05 14:45:02,997] {taskinstance.py:1067} INFO - Marking task as 
SUCCESS.dag_id=load_tmp_data_jagged, task_id=gcstobq_jagged, 
execution_date=20200512T000000, start_date=20201105T144447, 
end_date=20201105T144502
   [2020-11-05 14:45:02,998] {base_task_runner.py:113} INFO - Job 7389: Subtask 
gcstobq_jagged [2020-11-05 14:45:02,997] {taskinstance.py:1067} INFO - Marking 
task as SUCCESS.dag_id=load_tmp_data_jagged, task_id=gcstobq_jagged, 
execution_date=20200512T000000, start_date=20201105T144447, 
end_date=20201105T144502
   
   
   
   -->
   **Apache Airflow version**:
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: GCP
   - **OS** (e.g. from /etc/os-release):
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   
   <!-- (please include exact error messages if you can) -->
   
   **What you expected to happen**:
   
   <!-- What do you think went wrong? -->
   
   **How to reproduce it**:
   <!---
   
   As minimally and precisely as possible. Keep in mind we do not have access 
to your cluster or dags.
   
   If you are using kubernetes, please attempt to recreate the issue using 
minikube or kind.
   
   ## Install minikube/kind
   
   - Minikube https://minikube.sigs.k8s.io/docs/start/
   - Kind https://kind.sigs.k8s.io/docs/user/quick-start/
   
   If this is a UI bug, please provide a screenshot of the bug or a link to a 
youtube video of the bug in action
   
   You can include images using the .md style of
   ![alt text](http://url/to/img.png)
   
   To record a screencast, mac users can use QuickTime and then create an 
unlisted youtube video with the resulting .mov file.
   
   --->
   
   
   **Anything else we need to know**:
   
   <!--
   
   How often does this problem occur? Once? Every time etc?
   
   Any relevant logs to include? Put them here in side a detail tag:
   <details><summary>x.log</summary> lots of stuff </details>
   
   -->
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

