frodo2000 opened a new issue, #36277:
URL: https://github.com/apache/airflow/issues/36277

   ### Apache Airflow version
   
   2.7.3
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   Because `parameters` is missing from the `template_fields` tuple of `SnowflakeOperator`, there is no way to pass it a templated value (e.g. `operator.output`).
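
   For context, Airflow only Jinja-renders the attributes an operator lists in its `template_fields`. A minimal stdlib-only sketch of that mechanism (class and method names here are illustrative, not Airflow's actual implementation):

   ```python
   # Only attribute names listed in template_fields get rendered, which is
   # why a field absent from the tuple (like "parameters" here) never sees
   # its templated value resolved.
   class FakeOperator:
       template_fields = ("sql",)  # "parameters" deliberately missing

       def __init__(self, sql, parameters):
           self.sql = sql
           self.parameters = parameters

       def render_template_fields(self, context):
           for name in self.template_fields:
               value = getattr(self, name)
               if isinstance(value, str):
                   # stand-in for real Jinja rendering
                   setattr(self, name, value.replace("{{ x }}", str(context["x"])))

   op = FakeOperator(sql="select {{ x }}", parameters={"p": "{{ x }}"})
   op.render_template_fields({"x": 10})
   print(op.sql)              # select 10
   print(op.parameters["p"])  # {{ x }}  -- left unrendered
   ```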
   
   ### What you think should happen instead?
   
   `SnowflakeOperator` should template `parameters` the same way `SQLExecuteQueryOperator` does.
   
   ### How to reproduce
   
   ```python
   import pendulum
   from datetime import timedelta

   from airflow import DAG
   from airflow.operators.empty import EmptyOperator
   from airflow.operators.python import PythonOperator
   from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
   from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

   local_tz = pendulum.timezone("Europe/Warsaw")

   default_args = {
       'depends_on_past': False,
       'retries': 1,
   }

   with DAG(
           dag_id='dag_test_snowflake_params',
           schedule_interval=None,
           max_active_runs=1,
           catchup=True,
           dagrun_timeout=timedelta(minutes=420),
           start_date=pendulum.datetime(2023, 1, 3, tz=local_tz),
           default_args=default_args
   ) as dag:
       start_task = EmptyOperator(task_id='start_task')

       def test_function():
           return 10

       python_task = PythonOperator(
           task_id='python_task',
           python_callable=test_function
       )

       # Fails: SnowflakeOperator does not template "parameters",
       # so the XComArg below is never rendered.
       snowflake_task = SnowflakeOperator(
           task_id='snowflake_task',
           sql='select %(test_param)s',
           snowflake_conn_id='snowflake_conn',
           parameters={
               "test_param": "{}".format(python_task.output)
           }
       )

       # Works: SQLExecuteQueryOperator lists "parameters" in template_fields.
       sqlexecutequery_task = SQLExecuteQueryOperator(
           task_id='sqlexecutequery_task',
           sql='select %(test_param)s',
           conn_id='snowflake_conn',
           parameters={
               "test_param": "{}".format(python_task.output)
           }
       )

       start_task >> python_task >> snowflake_task >> sqlexecutequery_task
   ```
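
   Until `parameters` is added to the operator's `template_fields` upstream, one workaround is subclassing the operator and extending the tuple. A minimal sketch (a stand-in base class is used here so the snippet runs without Airflow installed; with Airflow, subclass `SnowflakeOperator` instead):

   ```python
   class BaseSnowflakeOperator:
       # stand-in for airflow.providers.snowflake.operators.snowflake.SnowflakeOperator
       template_fields = ("sql",)

   class TemplatedParamsSnowflakeOperator(BaseSnowflakeOperator):
       # Extend, don't replace, the parent's templated fields so "parameters"
       # is also rendered by Jinja.
       template_fields = tuple(BaseSnowflakeOperator.template_fields) + ("parameters",)

   print(TemplatedParamsSnowflakeOperator.template_fields)  # ('sql', 'parameters')
   ```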
   
   ### Operating System
   
   Ubuntu
   
   ### Versions of Apache Airflow Providers
   
   Tested with apache-airflow-providers-snowflake 5.0.1 and 5.2.0.
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

