in the next releases…
From: Campagnola, Francesco
Sent: Tuesday, 6 September 2016 09:46
To: 'Jeff Zhang' <zjf...@gmail.com>
Cc: user@spark.apache.org
Subject: RE: Spark 2.0.0 Thrift Server problem with Hive metastore
I mean I have installed Spark 2.0 in the same environment where S
:
org.apache.spark.sql.catalyst.expressions.GenericInternalRow cannot be cast to org.apache.spark.sql.catalyst.expressions.UnsafeRow
From: Jeff Zhang [mailto:zjf...@gmail.com]
Sent: Tuesday, 6 September 2016 02:50
To: Campagnola, Francesco <francesco.campagn...@anritsu.com>
Cc: user@spark.apache.org
Subject: Re: Spark 2.0.0 Thrift Server problem with Hive metastore
Hi,
In an already working Spark / Hive environment with Spark 1.6 and Hive 1.2.1,
with the Hive metastore configured on a Postgres DB, I have upgraded Spark to
version 2.0.0.
I have started the Thrift server on YARN, then tried to execute the following
command from the Beeline CLI or a JDBC client:
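(The command itself is not preserved in this excerpt.) For context, a typical way to reach the Spark Thrift Server from Beeline looks like the sketch below; the hostname, port, and username are assumptions, with 10000 being the default HiveServer2/Thrift port:

```shell
# Hypothetical sketch: connect Beeline to a Spark Thrift Server.
# "localhost", 10000, and the user "spark" are placeholder assumptions.
beeline -u "jdbc:hive2://localhost:10000/default" -n spark
```

A JDBC client would use the same `jdbc:hive2://…` URL with the Hive JDBC driver on its classpath.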