alexandermalyga opened a new issue, #25937:
URL: https://github.com/apache/airflow/issues/25937

   ### Apache Airflow Provider(s)
   
   trino
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-trino==4.0.0
   
   ### Apache Airflow version
   
   2.3.3
   
   ### Operating System
   
   macOS 12.5.1 (21G83)
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   `TrinoHook.insert_rows()` throws a syntax error because the underlying prepared statement uses "%s" as the parameter placeholder instead of the "?" [which Trino uses](https://trino.io/docs/current/sql/prepare.html#description).
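
   To make the mismatch concrete, here is a minimal sketch (plain Python, no Airflow required) of how the generated statement differs. `build_insert` is a hypothetical helper that only mimics how a DB-API hook assembles the VALUES clause:

```python
def build_insert(table: str, columns: list, placeholder: str) -> str:
    """Build a parameterized INSERT statement with the given placeholder."""
    cols = ", ".join(columns)
    params = ", ".join([placeholder] * len(columns))
    return f"INSERT INTO {table} ({cols}) VALUES ({params})"

# DB-API-style statement with "%s" placeholders, which Trino rejects:
print(build_insert("my_table", ["a", "b"], "%s"))
# → INSERT INTO my_table (a, b) VALUES (%s, %s)

# Trino's PREPARE syntax expects "?" placeholders instead:
print(build_insert("my_table", ["a", "b"], "?"))
# → INSERT INTO my_table (a, b) VALUES (?, ?)
```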
   
   ### What you think should happen instead
   
   `TrinoHook.insert_rows()` should insert rows using Trino-compatible SQL 
statements.
   
   The following exception is raised currently:
   `trino.exceptions.TrinoUserError: TrinoUserError(type=USER_ERROR, 
name=SYNTAX_ERROR, message="line 1:88: mismatched input '%'. Expecting: ')', 
<expression>, <query>", query_id=xxx)`
   
   ### How to reproduce
   
   Instantiate an `airflow.providers.trino.hooks.trino.TrinoHook` and call its `insert_rows()` method.
   Operators that use this method internally are also broken, e.g. `airflow.providers.trino.transfers.gcs_to_trino.GCSToTrinoOperator`.
   
   ### Anything else
   
   The issue seems to come from `TrinoHook.insert_rows()` relying on 
`DbApiHook.insert_rows()`, which uses "%s" to represent query parameters.
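
   One possible direction for a fix (a hypothetical sketch, not a tested patch): have the Trino hook generate its own INSERT statement with "?" placeholders rather than inheriting the "%s"-based one from `DbApiHook`. The function name and signature below are illustrative only:

```python
def generate_trino_insert_sql(table, values, target_fields=None):
    """Build a Trino-compatible INSERT using "?" parameter placeholders."""
    placeholders = ", ".join(["?"] * len(values))
    if target_fields:
        fields = ", ".join(target_fields)
        return f"INSERT INTO {table} ({fields}) VALUES ({placeholders})"
    return f"INSERT INTO {table} VALUES ({placeholders})"

print(generate_trino_insert_sql("my_table", (1, "x"), ["id", "name"]))
# → INSERT INTO my_table (id, name) VALUES (?, ?)
```

   A TrinoHook override along these lines would keep the rest of the `DbApiHook.insert_rows()` batching logic intact while producing SQL that Trino's PREPARE accepts.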
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

