But it forces you to create your own SparkContext, which I’d rather not do.
Also, it doesn’t seem to let me create a table directly from a DataFrame,
as follows:
TestHive.createDataFrame[MyType](rows).write.saveAsTable("a_table")
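For reference, a minimal sketch of the workaround being discussed, assuming Spark 2.1.x on the classpath; MyType, the sample rows, and the table name are placeholders, not from the original thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.test.TestHiveContext

// Placeholder case class standing in for MyType from the message above.
case class MyType(id: Int, name: String)

object TestHiveSketch {
  def main(args: Array[String]): Unit = {
    // You have to build the SparkContext yourself to use this constructor.
    val conf = new SparkConf().setMaster("local[2]").setAppName("test-hive-sketch")
    val sc = new SparkContext(conf)

    // loadTestTables = false skips loading the bundled "test tables",
    // which is what triggers the NullPointerException described below.
    val testHive = new TestHiveContext(sc, false)

    // createDataFrame + saveAsTable still work through the context itself.
    val rows = Seq(MyType(1, "a"), MyType(2, "b"))
    testHive.createDataFrame(rows).write.saveAsTable("a_table")

    sc.stop()
  }
}
```

The downside, as noted above, is that this forces you to manage your own SparkContext instead of using the TestHive singleton.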
From: Xin Wu [mailto:[email protected]]
Sent: January 13, 2017 12:43
To: Nicolas Tallineau <[email protected]>
Cc: [email protected]
Subject: Re: [Spark SQL - Scala] TestHive not working in Spark 2
I used the following:
val testHive = new org.apache.spark.sql.hive.test.TestHiveContext(sc, false)
val hiveClient = testHive.sessionState.metadataHive
hiveClient.runSqlHive("….")
On Fri, Jan 13, 2017 at 6:40 AM, Nicolas Tallineau
<[email protected]<mailto:[email protected]>> wrote:
I get a NullPointerException as soon as I try to execute a TestHive.sql(...)
statement since migrating to Spark 2, because it tries to load nonexistent
"test tables". I couldn't find a way to set the loadTestTables variable to
false.
Caused by: sbt.ForkMain$ForkError: java.lang.NullPointerException: null
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.getHiveFile(TestHive.scala:190)
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.org$apache$spark$sql$hive$test$TestHiveSparkSession$$quoteHiveFile(TestHive.scala:196)
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.<init>(TestHive.scala:234)
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.<init>(TestHive.scala:122)
    at org.apache.spark.sql.hive.test.TestHiveContext.<init>(TestHive.scala:80)
    at org.apache.spark.sql.hive.test.TestHive$.<init>(TestHive.scala:47)
    at org.apache.spark.sql.hive.test.TestHive$.<clinit>(TestHive.scala)
I’m using Spark 2.1.0 in this case.
Am I missing something, or should I file a bug in JIRA?
--
Xin Wu
(650)392-9799