We dropped support for Hive 1.x a while back. Would you be able to
move to Hive 2.x?

IIRC there were some workarounds discussed on this thread before. But,
given the push towards Hive 3.x, it's good to be on at least 2.x.
Let me know and we can go from there :)
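
For example, one interim option (just a rough sketch, assuming you are writing
through the Spark datasource; the record key / precombine fields below are
placeholders) is to turn off Hudi's built-in Hive sync so the 0.5.0 writer never
opens the Thrift session against the Hive 1.2.2 HiveServer2:

    // sketch only, untested against your setup; field names are placeholders
    import org.apache.spark.sql.SaveMode

    df.write
      .format("org.apache.hudi")
      .option("hoodie.table.name", "attunity_rep_base")                  // assumed from your path
      .option("hoodie.datasource.write.recordkey.field", "record_key")   // placeholder
      .option("hoodie.datasource.write.precombine.field", "ts")          // placeholder
      .option("hoodie.datasource.hive_sync.enable", "false")             // skip the HiveServer2 Thrift call
      .mode(SaveMode.Overwrite)
      .save("hdfs://localhost:9000/projects/cdp/data/attunity_poc/attunity_rep_base")

That only avoids the sync step, though; you would still have to register the
table in Hive yourself, so moving to Hive 2.x is the cleaner path.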

On Sun, Mar 1, 2020 at 1:09 PM selvaraj periyasamy <
selvaraj.periyasamy1...@gmail.com> wrote:

> I am using Hudi 0.5.0 and writing with the Spark writer (HoodieSparkSqlWriter).
>
> My Spark version is 2.3.0
> Scala version 2.11.8
> Hive version 1.2.2
>
> The write succeeds, but the Hive call is failing. From some Google
> references, it seems the Hive client is a higher version than the server.
> Since Hudi is built against Hive 2.3.1, is there a way to use 1.2.2?
>
> 2020-03-01 12:16:50 WARN  HoodieSparkSqlWriter$:110 - hoodie dataset at
> hdfs://localhost:9000/projects/cdp/data/attunity_poc/attunity_rep_base
> already exists. Deleting existing data & overwriting with new data.
> [Stage 111:============================>
> 2020-03-01 12:16:51 ERROR HiveConnection:697 - Error opening session
> org.apache.thrift.TApplicationException: Required field 'client_protocol'
> is unset! Struct:TOpenSessionReq(client_protocol:null,
> configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000,
> use:database=default})
>   at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
>   at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)
>   at org.apache.hudi.org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:168)
>   at org.apache.hudi.org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:155)
>
>
> Thanks,
> Selva
>
