raphaelauv commented on issue #31801:
URL: https://github.com/apache/airflow/issues/31801#issuecomment-2484574702

   Since Airflow 2.10.0
   
   you can pass a callable to `sql`:
   
   
   ```python
   from datetime import datetime
   from pathlib import Path
   
   from airflow.models import DAG
   from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   
   with DAG(dag_id="toto",
            start_date=datetime(2024, 1, 1),
            schedule="@daily"):
   
   
       def generate_sql_query(context, jinja_env) -> list[str]:
           # Read the SQL file that sits next to the DAG file, fill the
           # Python placeholders, then let Airflow render the Jinja template.
           with Path(__file__).with_name('toto.sql').open('r') as f:
               query = f.read().format(something="TOTO", logical_date="{{ ds }}")
               query = context["task"].render_template(query, context, jinja_env)
               # One list entry per SQL statement
               return query.split(";")
   
   
       SQLExecuteQueryOperator(
           task_id="something",
           sql=generate_sql_query,
           conn_id='toto',
       )
   
   ```
   
   The SQL file `toto.sql` lives in the same directory as the DAG file:
   ```sql
   select
       *,
       date('{logical_date}')
   from toto_{something}
   ```
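   
   To make the two-stage rendering concrete, here is a minimal stdlib-only sketch of what the callable does before Airflow's Jinja pass: `str.format` fills the Python placeholders immediately and leaves the Jinja expression (`{{ ds }}`) in place, then the result is split into one statement per list entry. The two-statement SQL string below is illustrative, not from the original comment:
   
   ```python
   # Hypothetical file contents, mirroring toto.sql but with two statements
   raw = (
       "select *, date('{logical_date}') from toto_{something};\n"
       "select count(*) from toto_{something}"
   )
   
   # Stage 1: str.format fills {something} now and injects the literal
   # Jinja expression "{{ ds }}" for Airflow to render at runtime.
   query = raw.format(something="TOTO", logical_date="{{ ds }}")
   
   # The callable returns one statement per list entry
   statements = [s.strip() for s in query.split(";")]
   print(statements)
   # ["select *, date('{{ ds }}') from toto_TOTO",
   #  "select count(*) from toto_TOTO"]
   ```
   
   Airflow's second stage (`task.render_template`) then replaces `{{ ds }}` with the run's logical date before the statements are executed.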

