mag3141592 opened a new issue, #27976:
URL: https://github.com/apache/airflow/issues/27976

   ### Apache Airflow Provider(s)
   
   common-sql
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-google==8.2.0
   apache-airflow-providers-http==4.0.0
   apache-airflow-providers-salesforce==5.0.0
   apache-airflow-providers-slack==5.1.0
   apache-airflow-providers-snowflake==3.2.0
   
   Issue:
   apache-airflow-providers-common-sql==1.3.0
   
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Deployment
   
   Astronomer
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   Problem occurred when upgrading from common-sql==1.2.0 to common-sql==1.3.0.
   
   
   Getting a `KeyError` when running a unique_check and null_check on a column.
   
   1.3.0 log:
   <img width="1609" alt="Screen Shot 2022-11-28 at 2 01 20 PM" src="https://user-images.githubusercontent.com/15257610/204390144-97ae35b7-1a2c-4ee1-9c12-4f3940047cde.png">
   
   1.2.0 log:
   <img width="1501" alt="Screen Shot 2022-11-28 at 2 00 15 PM" src="https://user-images.githubusercontent.com/15257610/204389994-7e8eae17-a346-41ac-84c4-9de4be71af20.png">
   
   
   ### What you think should happen instead
   
   Potential causes:
   - seems to index based on the check query's column alias `COL_NAME` instead of the table column `STRIPE_ID`
   - the `record` returned by the check changed type, going from a tuple to a list of dictionaries
   - no `tolerance` is specified for these tests; if the lookup indexes `['tolerance']` directly instead of using `.get('tolerance')` (which already defaults to `None`), the missing key would raise a `KeyError`
   
   Expected behavior:
   - these tests continue to pass after the upgrade
   - `tolerance` is not a required key
   
   ### How to reproduce
   
   ```python
   from datetime import datetime
   from airflow import DAG
   
   from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
   from airflow.providers.common.sql.operators.sql import SQLColumnCheckOperator
   
   my_conn_id = "snowflake_default"
   
    default_args = {"conn_id": my_conn_id}
   
   with DAG(
       dag_id="airflow_providers_example",
       schedule=None,
       start_date=datetime(2022, 11, 27),
       default_args=default_args,
   ) as dag:
   
       create_table = SnowflakeOperator(
           task_id="create_table",
           sql=""" CREATE OR REPLACE TABLE testing AS (
                           SELECT
                               1 AS row_num,
                               NULL AS field
   
                           UNION ALL
   
                           SELECT
                               2 AS row_num,
                               'test' AS field
   
                           UNION ALL
   
                           SELECT
                               3 AS row_num,
                               'test' AS field
                       )""",
       )
   
       column_checks = SQLColumnCheckOperator(
           task_id="column_checks",
           table="testing",
           column_mapping={
            "field": {"unique_check": {"equal_to": 0}, "null_check": {"equal_to": 0}}
           },
       )
   
    create_table >> column_checks
   ```
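   As a temporary workaround while this is open, pinning the provider back to 1.2.0 (assuming a pip-based image build) avoids the error:

   ```shell
   # Pin the common-sql provider to the last known-good release
   pip install "apache-airflow-providers-common-sql==1.2.0"
   ```

   On an Astronomer deployment, the same pin can go in the project's `requirements.txt`.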
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

