I am trying to change Spark to support Hive 0.13, but I keep hitting the
following failure when running the test suite. My feeling is that the test
setup may need to change, but I don't know exactly how. Has anyone run into
a similar issue, or can anyone shed light on it?
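
For what it's worth, my current guess is that the embedded Derby metastore
used by the tests is not being initialized with a schema Hive 0.13 accepts,
so the "default" database never shows up. Below is a minimal sketch of the
kind of setup change I have been experimenting with, assuming Spark's
HiveContext on a local SparkContext; the Derby path, the object name, and
the use of setConf/sql here are my own illustration, not what TestHive
actually does:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object Hive13Sandbox {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("hive-0.13-sandbox"))
    val hive = new HiveContext(sc)

    // Point the embedded Derby metastore at a fresh location so Hive 0.13
    // builds its schema (including the "default" database) from scratch.
    // The databaseName path is illustrative.
    hive.setConf("javax.jdo.option.ConnectionURL",
      "jdbc:derby:;databaseName=/tmp/hive13-metastore;create=true")

    // Hive 0.13 is stricter about metastore schema verification; relaxing
    // it avoids failures against a Derby schema created on the fly.
    hive.setConf("hive.metastore.schema.verification", "false")

    // Cheap guard: a no-op on a healthy metastore.
    hive.sql("CREATE DATABASE IF NOT EXISTS default")
  }
}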

13:50:53.331 ERROR org.apache.hadoop.hive.ql.Driver: FAILED: SemanticException [Error 10072]: Database does not exist: default
org.apache.hadoop.hive.ql.parse.SemanticException: Database does not exist: default
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1302)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1291)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:9944)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9180)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:391)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:291)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:944)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1009)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
        at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:292)
        at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:266)
        at org.apache.spark.sql.hive.test.TestHiveContext.runSqlHive(TestHive.scala:83)
        at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
        at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
        at org.apache.spark.sql.hive.HiveContext$QueryExecution.stringResult(HiveContext.scala:405)
        at org.apache.spark.sql.hive.test.TestHiveContext$SqlCmd$$anonfun$cmd$1.apply$mcV$sp(TestHive.scala:164)
        at org.apache.spark.sql.hive.test.TestHiveContext$$anonfun$loadTestTable$2.apply(TestHive.scala:282)
        at org.apache.spark.sql.hive.test.TestHiveContext$$anonfun$loadTestTable$2.apply(TestHive.scala:282)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.sql.hive.test.TestHiveContext.loadTestTable(TestHive.scala:282)
        at org.apache.spark.sql.hive.CachedTableSuite.<init>(CachedTableSuite.scala:28)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at java.lang.Class.newInstance(Class.java:374)
        at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:621)
        at sbt.ForkMain$Run$2.call(ForkMain.java:294)
        at sbt.ForkMain$Run$2.call(ForkMain.java:284)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Database does not exist: default
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1298)
        ... 35 more
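
Note that the failure happens while TestHiveContext.loadTestTable replays a
native CREATE TABLE command during CachedTableSuite construction, i.e.
before any test body runs, so the metastore itself seems to come up without
the default database. A quick check from a REPL would be something like the
following sketch (assuming the TestHive singleton is reachable and that
sql() runs HiveQL here):

import org.apache.spark.sql.hive.test.TestHive

// On a correctly initialized metastore this should print "default".
TestHive.sql("SHOW DATABASES").collect().foreach(println)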


