guptaaanchal11 opened a new issue, #32712:
URL: https://github.com/apache/airflow/issues/32712
### Apache Airflow version
Other Airflow 2 version (please specify below)
### What happened
Airflow version: 2.5.3
Description: While using BigQueryCreateExternalTableOperator to create an
external table on top of Parquet files located in GCS, Airflow raises the
error below:
google.api_core.exceptions.BadRequest: 400 POST ... failed. CsvOptions can
only be specified if storage format is CSV.
No csvOptions are specified; the source_format passed is PARQUET, yet the
request still includes CsvOptions. There seems to be an issue in the
bigquery.py hook.
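For illustration, here is a minimal sketch of the kind of guard that would avoid this error: only attach csvOptions when the storage format is actually CSV. The helper name and dict layout below are hypothetical, not the actual bigquery.py hook code.

```python
# Hypothetical sketch: build an external table configuration so that
# csvOptions is only attached when the source format is CSV.
# This is NOT the actual Airflow bigquery.py hook implementation.

def build_external_config(source_uris, source_format, skip_leading_rows=None):
    config = {
        "sourceUris": source_uris,
        "sourceFormat": source_format,
        "autodetect": True,
    }
    # BigQuery rejects requests with "CsvOptions can only be specified
    # if storage format is CSV", so gate the option on the format.
    if source_format == "CSV":
        config["csvOptions"] = {"skipLeadingRows": skip_leading_rows or 0}
    return config
```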
### What you think should happen instead
_No response_
### How to reproduce
Using BigQueryCreateExternalTableOperator, try to create an external table
on top of Parquet files located in GCS. Sample usage:
```python
create_external_table = BigQueryCreateExternalTableOperator(
    task_id="create_external_table",
    destination_project_dataset_table="table_name",
    bucket=GCS_BUCKET_NAME,
    source_objects=[<list of URIs of parquet files>],
    source_format="PARQUET",
    autodetect=True,
    gcp_conn_id="gcp_conn_id",
    google_cloud_storage_conn_id="test_conn_id",
)
```
### Operating System
-
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]