This should fix it: https://github.com/apache/spark/pull/16080
On Wed, Nov 30, 2016 at 10:55 AM, Timur Shenkao wrote:
Hello,
Yes, I used hiveContext, sqlContext, and sparkSession from Java, Scala, and
Python, via spark-shell, spark-submit, and IDEs (PyCharm, IntelliJ IDEA).
Everything works for me because I have a Hadoop cluster with a properly
configured and tuned Hive installation.
The usual cause of Michael's error is a misconfigured or missing Hive
installation.
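For reference, a minimal sketch of the Spark 2.x entry point Timur mentions (the app name is illustrative; this assumes a Spark build with Hive support):

```scala
import org.apache.spark.sql.SparkSession

// SparkSession with Hive support (Spark 2.x); this subsumes the older
// HiveContext/SQLContext entry points.
val spark = SparkSession.builder()
  .appName("HiveExample") // illustrative name
  .enableHiveSupport()    // requires a build with -Phive
  .getOrCreate()

// Queries now go through the Hive metastore.
spark.sql("SHOW TABLES").show()
```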
Hi Timur,
did you use hiveContext, sqlContext, or the SparkSession approach described
in http://spark.apache.org/docs/latest/sql-programming-guide.html?
Regards,
Gourav Sengupta
On Wed, Nov 30, 2016 at 5:35 PM, Yin Huai wrote:
Hello Michael,
Thank you for reporting this issue. It will be fixed by
https://github.com/apache/spark/pull/16080.
Thanks,
Yin
On Tue, Nov 29, 2016 at 11:34 PM, Timur Shenkao wrote:
Hi!
Do you have a real Hive installation?
Have you built Spark 2.1 & Spark 2.0 with Hive support (-Phive
-Phive-thriftserver)?
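For the record, the Hive profiles referred to above are passed to Spark's Maven build roughly like this (a sketch; the Hadoop profile shown is illustrative and depends on your cluster):

```shell
# Build Spark with Hive and the Thrift JDBC/ODBC server;
# -Phadoop-2.7 is an example profile, pick the one matching your cluster.
./build/mvn -Phive -Phive-thriftserver -Phadoop-2.7 -DskipTests clean package
```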
It seems you are using Spark's bundled "default" Hive 1.2.1. In that case
your metadata is stored in a local Derby database, which is visible only to
that particular Spark installation and not to anything else.
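A common way to avoid the per-installation Derby database is to point every Spark installation at a shared metastore via a hive-site.xml on Spark's classpath. A sketch, assuming a MySQL-backed metastore; the host, database name, and credentials below are placeholders:

```xml
<!-- conf/hive-site.xml: use a shared MySQL-backed metastore instead of
     the local Derby database. All values below are placeholders. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/hive_metastore</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive_password</value>
  </property>
</configuration>
```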