mtsadler-branch opened a new issue, #38681:
URL: https://github.com/apache/airflow/issues/38681
### Apache Airflow Provider(s)
snowflake
### Versions of Apache Airflow Providers
The provider appears to carry an undocumented version lock: `snowflake-connector-python<3.5.0`. Upgrading to the latest `snowflake-connector-python` causes `Timestamp` values to be rendered as unparsable epoch integers during `pandas_sql.read_query()`.
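Until the constraint is documented or lifted, a workaround (assuming the pin above is the actual compatibility boundary) is to keep the lock explicit in your requirements file:

```
# keep the connector below 3.5.0 until the provider handles newer versions
snowflake-connector-python<3.5.0
apache-airflow-providers-snowflake==5.3.1
```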
### Apache Airflow version
2.8.4
### Operating System
N/A
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
```
result = self.get_hook().get_pandas_df(query)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/common/sql/hooks/sql.py", line 227, in get_pandas_df
    return psql.read_sql(sql, con=conn, params=parameters, **kwargs)
  File "/home/airflow/.local/lib/python3.9/site-packages/pandas/io/sql.py", line 602, in read_sql
    return pandas_sql.read_query(
  File "/home/airflow/.local/lib/python3.9/site-packages/pandas/io/sql.py", line 2116, in read_query
    cursor = self.execute(*args)
  File "/home/airflow/.local/lib/python3.9/site-packages/pandas/io/sql.py", line 2068, in execute
    raise ex from exc
pandas.io.sql.DatabaseError: Execution failed on sql '
...
Timestamp '(seconds_since_epoch=1712074200000000000)' is not recognized; 73)
```
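For context (my reading, not confirmed against the connector source): the value in the error looks like nanoseconds since the Unix epoch, and decoding it with pandas yields a plausible recent timestamp. That suggests the newer connector is emitting a raw epoch-nanosecond integer where Snowflake expects a timestamp literal:

```python
import pandas as pd

# The epoch value from the error message above; pandas interprets a bare
# integer as nanoseconds by default, matching seconds_since_epoch * 1e9.
raw = 1712074200000000000
ts = pd.to_datetime(raw)
print(ts)  # decodes to a valid April 2024 timestamp, so the value itself is sane
```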
### What you think should happen instead
_No response_
### How to reproduce
install:
```
snowflake-connector-python==3.5.0
apache-airflow-providers-snowflake==5.3.1
```
run:
```
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

hook = SnowflakeHook()  # uses the default "snowflake_default" connection
query = '...'
hook.get_pandas_df(query)
```
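To confirm which connector version is actually installed in the worker image before reproducing (a helper I'm suggesting, not part of the original report), the installed distribution version can be checked with the standard library:

```python
from importlib.metadata import PackageNotFoundError, version

# Report the installed snowflake-connector-python version; the failure
# above was observed with 3.5.0.
try:
    print(version("snowflake-connector-python"))
except PackageNotFoundError:
    print("snowflake-connector-python is not installed")
```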
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)