[ https://issues.apache.org/jira/browse/SPARK-20926?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-20926:
------------------------------------

    Assignee: Apache Spark

> Exposure to Guava libraries by directly accessing tableRelationCache in
> SessionCatalog caused failures
> ------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20926
>                 URL: https://issues.apache.org/jira/browse/SPARK-20926
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Reza Safi
>            Assignee: Apache Spark
>
> Because of the shading we did for the Guava libraries, we see test failures
> whenever a component directly accesses tableRelationCache in
> SessionCatalog.
> This can happen in any component that shades the Guava library. Failures
> look like this:
> {noformat}
> java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableRelationCache()Lcom/google/common/cache/Cache;
> 01:25:14 at org.apache.spark.sql.hive.test.TestHiveSparkSession.reset(TestHive.scala:492)
> 01:25:14 at org.apache.spark.sql.hive.test.TestHiveContext.reset(TestHive.scala:138)
> 01:25:14 at org.apache.spark.sql.hive.test.TestHiveSingleton$class.afterAll(TestHiveSingleton.scala:32)
> 01:25:14 at org.apache.spark.sql.hive.StatisticsSuite.afterAll(StatisticsSuite.scala:34)
> 01:25:14 at org.scalatest.BeforeAndAfterAll$class.afterAll(BeforeAndAfterAll.scala:213)
> 01:25:14 at org.apache.spark.SparkFunSuite.afterAll(SparkFunSuite.scala:31)
> 01:25:14 at org.scalatest.BeforeAndAfterAll$$anonfun$run$1.apply(BeforeAndAfterAll.scala:280)
> 01:25:14 at org.scalatest.BeforeAndAfterAll$$anonfun$run$1.apply(BeforeAndAfterAll.scala:278)
> 01:25:14 at org.scalatest.CompositeStatus.whenCompleted(Status.scala:377)
> 01:25:14 at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:278)
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
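The NoSuchMethodError arises because SessionCatalog's public accessor returns a Guava type (com.google.common.cache.Cache); when a downstream component is built against a *shaded* Guava, the relocated class name no longer matches the method's signature and JVM linkage fails. One general way to avoid this class of breakage is to keep the third-party type private and expose only JDK-typed accessors. The sketch below is hypothetical, not Spark's actual code, and a ConcurrentHashMap stands in for the real Guava cache to keep it dependency-free:

```java
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the encapsulation pattern: the third-party cache
// type never appears in any public signature, so shaded consumers only link
// against JDK types. (Spark's real SessionCatalog holds a Guava Cache of
// table relations; a ConcurrentHashMap stands in for it here.)
public class SessionCatalogSketch {
    // Private field: the concrete cache type is an implementation detail.
    private final ConcurrentHashMap<String, String> tableRelationCache =
        new ConcurrentHashMap<>();

    // Public API uses only JDK types (String, Optional).
    public void cacheTable(String name, String plan) {
        tableRelationCache.put(name, plan);
    }

    public Optional<String> getCachedTable(String name) {
        return Optional.ofNullable(tableRelationCache.get(name));
    }

    public void invalidateCachedTable(String name) {
        tableRelationCache.remove(name);
    }

    public void invalidateAllCachedTables() {
        tableRelationCache.clear();
    }
}
```

With this shape, code like TestHiveSparkSession.reset() would call a stable wrapper method (e.g. invalidateAllCachedTables()) instead of fetching the cache object itself, so relocating Guava in a consumer's build has no effect on linkage.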