sodafountain opened a new issue #14871:
URL: https://github.com/apache/airflow/issues/14871


   
   **Apache Airflow version**: 2.0.1
   
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.18.14 (but the version doesn't matter; the same error occurs when I run with docker-compose on my local machine)
   
   **Environment**: 
   
   - **Cloud provider or hardware configuration**: Azure
   - **OS** (e.g. from /etc/os-release):
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
Provider `PostgresToGCSOperator` is unable to export data to GCS in the Parquet format.
   
   **What you expected to happen**:
Provider `PostgresToGCSOperator` should export the data to GCS in the Parquet format.
   
The execution failed with the following traceback:
   
   <details>
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
       self._prepare_and_execute_task_with_callbacks(context, task)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1285, in _prepare_and_execute_task_with_callbacks
       result = self._execute_task(context, task_copy)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1315, in _execute_task
       result = task_copy.execute(context=context)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 154, in execute
       files_to_upload = self._write_local_data_files(cursor)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 206, in _write_local_data_files
       parquet_schema = self._convert_parquet_schema(cursor)
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 280, in _convert_parquet_schema
       pq_types = [type_map.get(bq_type, pa.string()) for bq_type in bq_types]
     File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 280, in <listcomp>
       pq_types = [type_map.get(bq_type, pa.string()) for bq_type in bq_types]
   TypeError: unhashable type: 'dict'
   </details>
   
   
   **How to reproduce it**:
   
   I encountered this error when trying to transfer any table's data from Postgres to GCS in the Parquet format. My task definition is:
   <details>
       upload_data_to_gcs = PostgresToGCSOperator(
           postgres_conn_id='azure-qa-replica-connection',
           gcp_conn_id='google-cloud-connection',
           task_id='upload_data_to_gcs',
           sql=QUERY,
           bucket=GCS_BUCKET,
           filename=f'{TABLE}_{{{{ ts_nodash }}}}',
           gzip=False,
           use_server_side_cursor=True,
           export_format='parquet',
       )
   </details>
   
   **Anything else we need to know**:
   
   I tried digging through the code. This is my initial analysis; I could be wrong.
   In `airflow/airflow/providers/google/cloud/transfers/sql_to_gcs.py`, the method `_convert_parquet_schema` maps the Postgres column types to Parquet types.
   The line
   ```
   bq_types = [self.field_to_bigquery(field) for field in cursor.description]
   ```
   builds the BigQuery column types via `field_to_bigquery`, whose implementation in `postgres_to_gcs.py` is as follows:
   
   ```
   def field_to_bigquery(self, field) -> Dict[str, str]:
       return {
           'name': field[0],
           'type': self.type_map.get(field[1], "STRING"),
           'mode': 'REPEATED' if field[1] in (1009, 1005, 1007, 1016) else 'NULLABLE',
       }
   ```
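   For illustration, here is roughly what `bq_types` ends up looking like. The column names and OIDs below are hypothetical examples (OID 23 is Postgres `int4`, which, as far as I can tell, the provider's `type_map` maps to `INTEGER`; `text` is not in the map, so it falls back to `STRING`):
   ```
   # Hypothetical example: a query returning (id int4, name text).
   # cursor.description yields (name, type_oid, ...) tuples, so
   # field_to_bigquery produces one dict per column:
   bq_types = [
       {'name': 'id', 'type': 'INTEGER', 'mode': 'NULLABLE'},
       {'name': 'name', 'type': 'STRING', 'mode': 'NULLABLE'},
   ]
   ```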
   As you can see, this returns a dictionary for each field. However, the next line,
   ```
   pq_types = [type_map.get(bq_type, pa.string()) for bq_type in bq_types]
   ```
   iterates through that list of dictionaries and tries to look up the corresponding Parquet type. Since each `bq_type` is a dictionary, and dictionaries are not hashable, `type_map.get(bq_type, pa.string())` fails with `TypeError: unhashable type: 'dict'`.
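   The failure is easy to reproduce outside Airflow, since `dict.get` hashes its key argument and dictionaries are unhashable (the values below are stand-ins):
   ```
   type_map = {'INTEGER': 'pa.int64()'}  # stand-in for the real map
   bq_type = {'name': 'id', 'type': 'INTEGER', 'mode': 'NULLABLE'}
   type_map.get(bq_type)  # TypeError: unhashable type: 'dict'
   ```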
   
   IMO, the fix could be as simple as changing that line to
   ```
   pq_types = [type_map.get(bq_type.get('type', 'STRING'), pa.string()) for bq_type in bq_types]
   ```
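   As a quick sanity check of the proposed fix (using an abbreviated stand-in for the real `type_map` built in `_convert_parquet_schema`):
   ```
   import pyarrow as pa

   type_map = {'INTEGER': pa.int64(), 'FLOAT': pa.float64()}  # abbreviated
   bq_types = [
       {'name': 'id', 'type': 'INTEGER', 'mode': 'NULLABLE'},
       {'name': 'name', 'type': 'STRING', 'mode': 'NULLABLE'},
   ]
   pq_types = [type_map.get(bq_type.get('type', 'STRING'), pa.string()) for bq_type in bq_types]
   print(pq_types)  # [DataType(int64), DataType(string)]
   ```
   Unknown or missing types fall back to `pa.string()`, matching the existing default behavior.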