frodo2000 opened a new issue, #32551:
URL: https://github.com/apache/airflow/issues/32551
### Apache Airflow version
Other Airflow 2 version (please specify below)
### What happened
We are using Airflow 2.6.0 with the Airflow Snowflake Provider 4.3.0.
When we add the database, schema, and warehouse parameters to SnowflakeOperator, they all override the corresponding values in the extra part of the Snowflake connection definition. With the same set of parameters on SnowflakeValueCheckOperator, none of the parameters is overridden.
### What you think should happen instead
When we went through the Snowflake Provider source code, we found that for SnowflakeOperator the hook_params are created before the parent class init. It looks like this:
```python
if any([warehouse, database, role, schema, authenticator, session_parameters]):
    hook_params = kwargs.pop("hook_params", {})
    kwargs["hook_params"] = {
        "warehouse": warehouse,
        "database": database,
        "role": role,
        "schema": schema,
        "authenticator": authenticator,
        "session_parameters": session_parameters,
        **hook_params,
    }
super().__init__(conn_id=snowflake_conn_id, **kwargs)
```
For SnowflakeValueCheckOperator, the parent class init is called before the class arguments are initialized:
```python
super().__init__(sql=sql, parameters=parameters, conn_id=snowflake_conn_id, **kwargs)
self.snowflake_conn_id = snowflake_conn_id
self.sql = sql
self.autocommit = autocommit
self.do_xcom_push = do_xcom_push
self.parameters = parameters
self.warehouse = warehouse
self.database = database
```
The hook used by SnowflakeValueCheckOperator (and probably by the rest of the check operators) is therefore initialized from the connection values only: the warehouse/database/schema arguments are stored on the operator but never forwarded to the hook, so overriding does not work.
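A minimal sketch of a possible fix, mirroring the SnowflakeOperator pattern quoted above: build hook_params from the override arguments before calling the parent init in SnowflakeValueCheckOperator. This is only an illustration; the real signature also takes sql, pass_value, and other arguments that are omitted here:

```python
def __init__(
    self,
    *,
    snowflake_conn_id="snowflake_default",
    warehouse=None,
    database=None,
    role=None,
    schema=None,
    authenticator=None,
    session_parameters=None,
    **kwargs,
):
    # Merge the operator-level overrides into hook_params *before* the
    # parent __init__ runs, so the hook is created with them instead of
    # falling back to the connection defaults.
    if any([warehouse, database, role, schema, authenticator, session_parameters]):
        hook_params = kwargs.pop("hook_params", {})
        kwargs["hook_params"] = {
            "warehouse": warehouse,
            "database": database,
            "role": role,
            "schema": schema,
            "authenticator": authenticator,
            "session_parameters": session_parameters,
            **hook_params,
        }
    super().__init__(conn_id=snowflake_conn_id, **kwargs)
```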
### How to reproduce
We should create connection with different database and warehouse than
TEST_DB and TEST_WH. Table dual should exist only in TEST_DB.TEST_SCHEMA and
not exists in connection db/schema.
```python
from datetime import timedelta

import jinja2
import pendulum
from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import (
    SnowflakeOperator,
    SnowflakeValueCheckOperator,
)

warehouse = 'TEST_WH'
database = 'TEST_DB'
schema = 'TEST_SCHEMA'

args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email_on_failure': True,
    'email_on_retry': False,
    'start_date': pendulum.now(tz='Europe/Warsaw').add(months=-1),
    'retries': 0,
    'concurrency': 10,
    'dagrun_timeout': timedelta(hours=24),
}

with DAG(
    dag_id='snowflake_override_test',  # placeholder; any DAG id works
    template_undefined=jinja2.Undefined,
    default_args=args,
    description='Snowflake parameter override test',  # placeholder description
    schedule=None,  # placeholder; trigger manually
    max_active_runs=10,
    catchup=False,
) as dag:
    value_check_task = SnowflakeValueCheckOperator(
        task_id='value_check_task',
        sql='select 1 from dual',
        snowflake_conn_id='con_snowflake_zabka',
        warehouse=warehouse,
        database=database,
        schema=schema,
        pass_value=1,
    )

    snowflake_export_data_task = SnowflakeOperator(
        task_id='snowflake_export_data_task',
        snowflake_conn_id='con_snowflake',
        sql='select 1 from dual',
        warehouse=warehouse,
        database=database,
        schema=schema,
    )
```
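As a possible workaround until this is fixed (an untested sketch, assuming the check operators inherit the `hook_params` argument from the common-sql BaseSQLOperator), the overrides can be passed through hook_params directly:

```python
# Untested workaround sketch: pass the overrides straight into hook_params,
# bypassing the operator keyword arguments that are currently ignored.
value_check_task_workaround = SnowflakeValueCheckOperator(
    task_id='value_check_task_workaround',
    sql='select 1 from dual',
    snowflake_conn_id='con_snowflake_zabka',
    hook_params={
        'warehouse': warehouse,
        'database': database,
        'schema': schema,
    },
    pass_value=1,
)
```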
### Operating System
Ubuntu 20.04.5 LTS
### Versions of Apache Airflow Providers
apache-airflow 2.6.0
apache-airflow-providers-celery 3.1.0
apache-airflow-providers-common-sql 1.4.0
apache-airflow-providers-ftp 3.3.1
apache-airflow-providers-http 4.3.0
apache-airflow-providers-imap 3.1.1
apache-airflow-providers-microsoft-azure 6.1.1
apache-airflow-providers-odbc 3.2.1
apache-airflow-providers-oracle 3.6.0
apache-airflow-providers-postgres 5.4.0
apache-airflow-providers-redis 3.1.0
apache-airflow-providers-snowflake 4.3.0
apache-airflow-providers-sqlite 3.3.2
apache-airflow-providers-ssh 3.6.0
### Deployment
Virtualenv installation
### Deployment details
Python 3.9.5
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)