[ 
https://issues.apache.org/jira/browse/AIRFLOW-5018?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yun Xu updated AIRFLOW-5018:
----------------------------
    Description: 
[https://github.com/apache/airflow/blob/master/airflow/operators/mysql_to_hive.py#L107-L108]

Currently in Airflow's MySqlToHiveTransfer, the type mapping for LONG and 
LONGLONG appears to be LONG -> BIGINT and LONGLONG -> DECIMAL(38,0). However, 
based on the MySQL type codes, shouldn't it be LONG -> INT and LONGLONG -> BIGINT?
[https://dev.mysql.com/doc/refman/8.0/en/c-api-prepared-statement-type-codes.html]

Correct me if I'm misunderstanding this. Thanks!
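To illustrate the proposed correction, here is a minimal sketch of the mapping. The numeric constants mirror the MySQL C API's enum_field_types values (MYSQL_TYPE_LONG = 3, MYSQL_TYPE_LONGLONG = 8), which MySQLdb exposes as MySQLdb.constants.FIELD_TYPE.LONG/LONGLONG; map_mysql_type is a hypothetical helper for this example, not Airflow's actual API:

```python
# Type codes from the MySQL C API (enum_field_types); MySQLdb's
# FIELD_TYPE constants carry the same values.
MYSQL_TYPE_LONG = 3      # MySQL INT (32-bit signed)
MYSQL_TYPE_LONGLONG = 8  # MySQL BIGINT (64-bit signed)

# Proposed mapping per this issue: LONG -> INT, LONGLONG -> BIGINT.
_HIVE_TYPE_MAP = {
    MYSQL_TYPE_LONG: 'INT',
    MYSQL_TYPE_LONGLONG: 'BIGINT',
}

def map_mysql_type(mysql_type_code):
    """Return the Hive column type for a MySQL field type code.

    Falls back to STRING for unmapped codes, matching the operator's
    existing default behavior.
    """
    return _HIVE_TYPE_MAP.get(mysql_type_code, 'STRING')
```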

 

> MySqlToHiveTransfer Operator type mapping issue
> -----------------------------------------------
>
>                 Key: AIRFLOW-5018
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5018
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: operators
>    Affects Versions: 1.10.3
>            Reporter: Yun Xu
>            Priority: Major
>
> [https://github.com/apache/airflow/blob/master/airflow/operators/mysql_to_hive.py#L107-L108]
> Currently in Airflow's MySqlToHiveTransfer, the type mapping for LONG and 
> LONGLONG appears to be LONG -> BIGINT and LONGLONG -> DECIMAL(38,0). However, 
> based on the MySQL type codes, shouldn't it be LONG -> INT and LONGLONG -> BIGINT?
> [https://dev.mysql.com/doc/refman/8.0/en/c-api-prepared-statement-type-codes.html]
> Correct me if I'm misunderstanding this. Thanks!
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)