RamesChan opened a new issue, #44902:
URL: https://github.com/apache/airflow/issues/44902

   ### Apache Airflow Provider(s)
   
   odbc
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-odbc==4.7.0
   
   
   ### Apache Airflow version
   
   2.7.1 and 2.10.2
   
   ### Operating System
   
   PRETTY_NAME="Debian GNU/Linux 12 (bookworm)" NAME="Debian GNU/Linux" VERSION_ID="12" VERSION="12 (bookworm)" VERSION_CODENAME=bookworm ID=debian HOME_URL="https://www.debian.org/" SUPPORT_URL="https://www.debian.org/support" BUG_REPORT_URL="https://bugs.debian.org/"
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   I build Airflow from image tags 2.7.1 and 2.10.2. Both images use the same Dockerfile; only the base image tag changes. The Dockerfile installs the IBM Netezza data warehouse ODBC driver and copies the unixODBC configuration files.
   
   **Dockerfile**
   ```dockerfile
   # <--- change this tag between 2.7.1 and 2.10.2
   FROM apache/airflow:2.10.2
   
   USER root
   RUN apt-get update
   RUN apt install -y \
       nano \
       procps \
       unixodbc \
       build-essential \
       unzip
   RUN rm -rf /var/lib/apt/lists/*
   
   # ENV JAVA_HOME /usr/lib/jvm/java-17-openjdk-amd64
   ENV LD_LIBRARY_PATH=/usr/local/nz/lib64
   
   COPY ./odbc/* /opt/nz_odbc/
   
   WORKDIR /opt/nz_odbc
   RUN tar -xvzf nps-linuxclient-v11.2.1.10.tar.gz
   
   WORKDIR /opt/nz_odbc/linux64
   RUN ./unpack
   
   COPY ./odbc/config/odbc.ini /etc/odbc.ini
   COPY ./odbc/config/odbc.ini /usr/local/etc/odbc.ini
   COPY ./odbc/config/odbcinst.ini /etc/odbcinst.ini
   COPY ./odbc/config/odbcinst.ini /usr/local/etc/odbcinst.ini
   
   USER airflow
   
   RUN pip install apache-airflow apache-airflow-providers-odbc
   
   WORKDIR /opt/
   ```
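
   As a quick sanity check of the driver install, a small Python snippet run inside the built image can confirm that the Netezza ODBC shared library is actually loadable. This is only a sketch: the directory comes from the Dockerfile's `LD_LIBRARY_PATH`, but the `libnzodbc.so` file name is an assumption about the Netezza client layout and may need adjusting.

   ```python
   # Sanity check (run inside the container): can the Netezza ODBC shared
   # library be loaded at all? The directory comes from LD_LIBRARY_PATH in
   # the Dockerfile; the file name libnzodbc.so is an assumption.
   import ctypes
   import os


   def try_load(path: str = "/usr/local/nz/lib64/libnzodbc.so"):
       """Return None if the library loads, otherwise a short error message."""
       if not os.path.exists(path):
           return f"missing: {path}"
       try:
           ctypes.CDLL(path)  # dlopen(); fails if the driver's own deps are unresolved
       except OSError as exc:
           return str(exc)
       return None


   if __name__ == "__main__":
       print(try_load() or "driver library loads OK")
   ```

   If this fails differently in the 2.7.1 and 2.10.2 images, the problem is at the shared-library level rather than in the provider.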
   
   ### What happened
   
   After that I run `docker compose up` and then create a connection named "nz_odbc", as in the image below.
   
![image](https://github.com/user-attachments/assets/df3efb2a-20d8-4205-8e82-0ff4b542fb31)
   
   and then use this Python script in a DAG to create a table in Netezza.
   ```python
   import pyodbc

   from airflow import DAG
   from airflow.operators.python import PythonOperator
   from airflow.providers.odbc.hooks.odbc import OdbcHook
   from datetime import datetime, timedelta
   
   default_args = {
       'owner': 'MyDAG',
       'depends_on_past': False,
       'start_date': datetime(2024, 12, 13),
       'retries': 0,
       'retry_delay': timedelta(minutes=5),
   }
   
   dag = DAG(
       dag_id="create_table",
       default_args=default_args,
       schedule_interval="@daily"
   )
   
   def get_data_from_system_i():
       
       odbc_hook = OdbcHook(odbc_conn_id='nz_odbc')
       
       connection = odbc_hook.get_conn()
   
       # Set encoding (optional, depends on your driver/database)
       connection.setdecoding(pyodbc.SQL_CHAR, encoding='utf-8')
       connection.setdecoding(pyodbc.SQL_WCHAR, encoding='utf-8')
       connection.setdecoding(pyodbc.SQL_WMETADATA, encoding='utf-8')
       connection.setencoding(encoding='utf-8')
   
       cursor = connection.cursor()
       
       cursor.execute("""
           CREATE TABLE EMPLOYEE (
               id int,
               name VARCHAR(50),
               department VARCHAR(50)
           )
       """)
       
       connection.commit() 
       
       cursor.close()
       connection.close()
   
   create_table = PythonOperator(
       task_id='get_customer_data',
       python_callable=get_data_from_system_i,
       dag=dag
   )
   
   create_table
   ```
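
   To separate the provider from the driver, a minimal standalone pyodbc script (no Airflow involved) that exercises the same path might look like the sketch below. The DSN `nz_odbc` and the credentials are placeholders matching the connection above; the connection attempt is gated behind an environment variable because it needs the driver and a live server.

   ```python
   # Standalone reproduction without Airflow: connect through the same DSN
   # the OdbcHook uses. DSN name and credentials are placeholders.
   import os


   def build_conn_str(dsn: str, uid: str, pwd: str) -> str:
       """Assemble a DSN-based ODBC connection string."""
       return f"DSN={dsn};UID={uid};PWD={pwd}"


   def main() -> None:
       import pyodbc  # imported here so the helper above works without the driver

       conn = pyodbc.connect(build_conn_str("nz_odbc", "admin", "password"))
       cursor = conn.cursor()
       cursor.execute("SELECT 1")
       print(cursor.fetchone())
       conn.close()


   if __name__ == "__main__" and os.environ.get("RUN_ODBC_REPRO"):
       main()  # opt in explicitly; requires the Netezza driver and a reachable server
   ```

   If this script fails in the 2.10.2 image with the same error as the DAG, the provider can be ruled out and the issue narrowed to pyodbc/unixODBC/driver interaction.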
   
   then it fails when I use Airflow 2.10.2 but succeeds in version 2.7.1.
   
   the error on airflow:2.10.2:
   
![image](https://github.com/user-attachments/assets/3154431a-c38d-4ec7-9232-6b82b3942f2b)
   
   
   ### What you think should happen instead
   
   It should succeed, as it does on airflow:2.7.1.
   
   ### How to reproduce
   
   1. install the Netezza ODBC driver
   2. test that the configuration is correct with **isql**
   3. run the airflow DAG
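
   Step 2 can also be checked from Python itself: the sketch below lists the drivers unixODBC exposes to pyodbc. The driver name `NetezzaSQL` is a guess at what odbcinst.ini registers; substitute the name from your configuration.

   ```python
   # Check which ODBC drivers unixODBC exposes to pyodbc. If the Netezza
   # driver is not visible here, the OdbcHook can never load it either.
   def driver_available(drivers, name: str = "NetezzaSQL") -> bool:
       """Case-insensitive substring match over the registered driver names."""
       return any(name.lower() in d.lower() for d in drivers)


   def main() -> None:
       import pyodbc  # only needed inside the container

       registered = pyodbc.drivers()
       print("registered drivers:", registered)
       print("Netezza driver visible:", driver_available(registered))
   ```

   Running `main()` inside both images should show whether the driver registration itself differs between the 2.7.1 and 2.10.2 builds.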
   
   ### Anything else
   
   - when I write a plain pyodbc script and run it in the container with `python3 main.py`, Airflow 2.10.2 shows the same error that appears in the Airflow log, but Airflow 2.7.1 succeeds.
   - I tried installing everything on Ubuntu 20.04 (maybe the same base as airflow:2.7.1) and 22.04 (maybe the same base as airflow:2.10.2) and then running the pyodbc script; it shows the same error as above.
   - that makes me think the Ubuntu version is not the cause; maybe Airflow 2.7.1 has some configuration or installed package that differs from the others.
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

