Patrick Wendell created SPARK-2264:
--------------------------------------

             Summary: CachedTableSuite SQL Tests are Failing
                 Key: SPARK-2264
                 URL: https://issues.apache.org/jira/browse/SPARK-2264
             Project: Spark
          Issue Type: Bug
            Reporter: Patrick Wendell
            Assignee: Michael Armbrust
            Priority: Blocker


{code}
[info] CachedTableSuite:
[info] - read from cached table and uncache *** FAILED ***
[info]   java.lang.RuntimeException: Table Not Found: testData
[info]   at scala.sys.package$.error(package.scala:27)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at scala.Option.getOrElse(Option.scala:120)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:64)
[info]   at org.apache.spark.sql.SQLContext.table(SQLContext.scala:185)
[info]   at org.apache.spark.sql.CachedTableSuite$$anonfun$1.apply$mcV$sp(CachedTableSuite.scala:43)
[info]   at org.apache.spark.sql.CachedTableSuite$$anonfun$1.apply(CachedTableSuite.scala:27)
[info]   at org.apache.spark.sql.CachedTableSuite$$anonfun$1.apply(CachedTableSuite.scala:27)
[info]   at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]   ...
[info] - correct error on uncache of non-cached table *** FAILED ***
[info]   Expected exception java.lang.IllegalArgumentException to be thrown, but java.lang.RuntimeException was thrown. (CachedTableSuite.scala:55)
[info] - SELECT Star Cached Table *** FAILED ***
[info]   java.lang.RuntimeException: Table Not Found: testData
[info]   at scala.sys.package$.error(package.scala:27)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at scala.Option.getOrElse(Option.scala:120)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:64)
[info]   at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$1.applyOrElse(Analyzer.scala:67)
[info]   at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$1.applyOrElse(Analyzer.scala:65)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:165)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:183)
[info]   at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
[info]   ...
[info] - Self-join cached *** FAILED ***
[info]   java.lang.RuntimeException: Table Not Found: testData
[info]   at scala.sys.package$.error(package.scala:27)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at scala.Option.getOrElse(Option.scala:120)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:64)
[info]   at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$1.applyOrElse(Analyzer.scala:67)
[info]   at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$1.applyOrElse(Analyzer.scala:65)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:165)
[info]   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:183)
[info]   at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
[info]   ...
[info] - 'CACHE TABLE' and 'UNCACHE TABLE' SQL statement *** FAILED ***
[info]   java.lang.RuntimeException: Table Not Found: testData
[info]   at scala.sys.package$.error(package.scala:27)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog$$anonfun$1.apply(Catalog.scala:64)
[info]   at scala.Option.getOrElse(Option.scala:120)
[info]   at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:64)
[info]   at org.apache.spark.sql.SQLContext.cacheTable(SQLContext.scala:189)
[info]   at org.apache.spark.sql.execution.CacheCommand.sideEffectResult$lzycompute(commands.scala:110)
[info]   at org.apache.spark.sql.execution.CacheCommand.sideEffectResult(commands.scala:108)
[info]   at org.apache.spark.sql.execution.CacheCommand.execute(commands.scala:118)
[info]   at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:322)
[info]   ...
{code}
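All five failures stem from the same symptom: {{SimpleCatalog.lookupRelation}} cannot resolve a table named {{testData}}, so {{SQLContext.table}} and {{SQLContext.cacheTable}} throw {{RuntimeException: Table Not Found}} before the caching behavior under test is ever reached. For reference, a minimal sketch of the registration step these tests appear to depend on, written against the Spark 1.0-era API ({{SchemaRDD.registerAsTable}} via the {{createSchemaRDD}} implicit); the {{TestRecord}} case class and its rows are hypothetical stand-ins, not the suite's actual fixture data:

{code}
// Hypothetical repro sketch (not the suite's code): shows that the "Table Not
// Found: testData" failures disappear once a table with that name is registered
// in the catalog before table()/cacheTable() are called.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Made-up schema for illustration only.
case class TestRecord(key: Int, value: String)

object CachedTableRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("repro"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD

    // Without this registration, lookupRelation fails exactly as in the
    // stack traces above.
    val rdd = sc.parallelize((1 to 10).map(i => TestRecord(i, i.toString)))
    rdd.registerAsTable("testData")

    sqlContext.cacheTable("testData")    // SQLContext.scala:189 in the traces
    sqlContext.table("testData").count() // SQLContext.scala:185 in the traces
    sqlContext.uncacheTable("testData")

    sc.stop()
  }
}
{code}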


