denimalpaca opened a new issue, #26046:
URL: https://github.com/apache/airflow/issues/26046

   ### Apache Airflow Provider(s)
   
   common-sql
   
   ### Versions of Apache Airflow Providers
   
   Using `apache-airflow-providers-common-sql==1.1.0`
   
   ### Apache Airflow version
   
   2.3.2
   
   ### Operating System
   
   Debian GNU/Linux 11 bullseye
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   astro-runtime:5.0.5
   
   ### What happened
   
   The `isinstance()` check that verifies the operator's hook is a `DbApiHook` 
breaks when a Snowflake connection is passed to an operator's `conn_id` 
parameter: the check sees a `SnowflakeHook` and rejects it as not being a 
`DbApiHook`, even though `SnowflakeHook` is a `DbApiHook` subclass.
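   The likely mechanism is that `isinstance()` compares against a specific class object, so it fails when the hook subclasses a `DbApiHook` defined in a different module than the one the operator imports. A minimal, self-contained sketch (the class names below are illustrative stand-ins, not Airflow's real module layout):

   ```python
   # Two distinct base classes with the same role, standing in for a
   # DbApiHook defined in two different provider modules (an assumed
   # simplification of the real Airflow import layout):
   class LegacyDbApiHook:        # e.g. the hook base one provider subclasses
       pass

   class CommonSqlDbApiHook:     # e.g. the base the operator imports and checks
       pass

   class SnowflakeHook(LegacyDbApiHook):  # subclasses the *other* base
       pass

   hook = SnowflakeHook()

   # The operator checks against its own imported base class, so the
   # check fails even though the hook is a valid DB-API hook:
   print(isinstance(hook, CommonSqlDbApiHook))  # False
   print(isinstance(hook, LegacyDbApiHook))     # True
   ```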
   
   ### What you think should happen instead
   
   There should be no error when a subclass of `DbApiHook` is used. This 
could be fixed by replacing the `isinstance()` check with one that walks the 
hook's class hierarchy and matches on class name rather than class identity, 
so that a `DbApiHook` subclass is accepted regardless of which module its base 
class was imported from.
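   One possible shape for such a check (a hedged sketch of the idea, not the actual fix merged into Airflow): accept any hook whose MRO contains a class *named* `DbApiHook`, whatever module it came from.

   ```python
   def looks_like_dbapi_hook(hook) -> bool:
       """Return True if any ancestor class of the hook is named 'DbApiHook'.

       Matching on __name__ instead of class identity sidesteps the
       duplicate-base-class problem, at the cost of accepting any class
       that happens to use that name.
       """
       return any(cls.__name__ == "DbApiHook" for cls in type(hook).__mro__)

   # Illustrative stand-ins (not real Airflow imports):
   class DbApiHook:
       pass

   class SnowflakeHook(DbApiHook):
       pass

   print(looks_like_dbapi_hook(SnowflakeHook()))  # True
   print(looks_like_dbapi_hook(object()))         # False
   ```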
   
   ### How to reproduce
   
   Run an operator from the common-sql provider with a Snowflake connection 
passed to `conn_id`.
   
   ### Anything else
   
   Occurs every time.
   Log:
   ```
   [2022-08-29, 19:10:42 UTC] {manager.py:49} ERROR - Failed to extract 
metadata The connection type is not supported by SQLColumnCheckOperator. The 
associated hook should be a subclass of `DbApiHook`. Got SnowflakeHook 
task_type=SQLColumnCheckOperator airflow_dag_id=complex_snowflake_transform 
task_id=quality_check_group_forestfire.forestfire_column_checks 
airflow_run_id=manual__2022-08-29T19:04:54.998289+00:00 
   Traceback (most recent call last):
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/manager.py", line 
38, in extract_metadata
       task_metadata = extractor.extract_on_complete(task_instance)
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/sql_check_extractors.py",
 line 26, in extract_on_complete
       return super().extract()
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/sql_extractor.py", 
line 50, in extract
       authority=self._get_authority(),
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/snowflake_extractor.py",
 line 57, in _get_authority
       return self.conn.extra_dejson.get(
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/sql_extractor.py", 
line 102, in conn
       self._conn = get_connection(self._conn_id())
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/sql_extractor.py", 
line 91, in _conn_id
       return getattr(self.hook, self.hook.conn_name_attr)
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/sql_extractor.py", 
line 96, in hook
       self._hook = self._get_hook()
     File 
"/usr/local/airflow/include/openlineage/airflow/extractors/snowflake_extractor.py",
 line 63, in _get_hook
       return self.operator.get_db_hook()
     File 
"/usr/local/lib/python3.9/site-packages/airflow/providers/common/sql/operators/sql.py",
 line 112, in get_db_hook
       return self._hook
     File "/usr/local/lib/python3.9/functools.py", line 969, in __get__
       val = self.func(instance)
     File 
"/usr/local/lib/python3.9/site-packages/airflow/providers/common/sql/operators/sql.py",
 line 95, in _hook
       raise AirflowException(
   airflow.exceptions.AirflowException: The connection type is not supported by 
SQLColumnCheckOperator. The associated hook should be a subclass of 
`DbApiHook`. Got SnowflakeHook
   ```
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

