[ https://issues.apache.org/jira/browse/FLINK-37097?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17912394#comment-17912394 ]

Sergey Nuyanzin edited comment on FLINK-37097 at 1/13/25 8:13 AM:
------------------------------------------------------------------

I agree about moving forward with releasing the Hive connector based on 1.20.

Since I created a PR to support 1.20 [1], it would be great if someone could approve it.

After that I can start an ML discussion about another iteration to release the 
external Hive connector and prepare an RC.

[~luoyuxia] 
{quote}
Hi, about this issue, I'm wondering whether the failures are thrown by the hive-connector 
or by the Hive server instance started for testing
{quote}
It seems there are different issues.
For instance, the one that I mentioned is failing during 
{{HiveFunctionWrapperTest.testDeserializeUDF}}, where we don't have a Hive server 
instance.

We could turn off specific tests for JDK 11 (with a special annotation, like we do in the 
main repo for JDK 17 and 21) and then mention in the README that, in the case of JDK 11, 
users should test carefully before use since there is no full support/coverage.

[1] https://github.com/apache/flink-connector-hive/pull/20
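
For illustration only, a minimal sketch of such a JDK-based exclusion using plain JUnit 5's 
{{@DisabledOnJre}}; the class and test names here are hypothetical, and the main repo may 
rely on its own annotations or Maven profiles instead:

{code:java}
// Hypothetical sketch: skipping a single test on JDK 11 with plain JUnit 5.
// The Flink main repo may use its own annotations/profiles for this purpose.
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.DisabledOnJre;
import org.junit.jupiter.api.condition.JRE;

class HiveConnectorJdkConditionalTest {

    @Test
    @DisabledOnJre(value = JRE.JAVA_11, disabledReason = "Not fully supported on JDK 11")
    void testRequiringNewerJdk() {
        // test body that is known to fail on JDK 11
    }
}
{code}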



> Remove Hive connector from core Flink
> -------------------------------------
>
>                 Key: FLINK-37097
>                 URL: https://issues.apache.org/jira/browse/FLINK-37097
>             Project: Flink
>          Issue Type: Technical Debt
>          Components: Connectors / Hive
>    Affects Versions: 2.0.0
>            Reporter: david radley
>            Assignee: david radley
>            Priority: Minor
>
> As per [https://github.com/apache/flink/pull/25947] the Hive code has been 
> externalized into a new repo. We should remove the flink-connector-hive and 
> flink-sql-connector-hive-* Maven modules from core Flink.
>  
> I am happy to make this change if someone can assign me the Jira.


