Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/11744#discussion_r56286574
--- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala ---
@@ -87,13 +88,16 @@ class HiveSparkSubmitSuite
     runSparkSubmit(args)
   }

-  ignore("SPARK-8489: MissingRequirementError during reflection") {
+  test("SPARK-8489: MissingRequirementError during reflection") {
     // This test uses a pre-built jar to test SPARK-8489. In a nutshell, this test creates
     // a HiveContext and uses it to create a data frame from an RDD using reflection.
     // Before the fix in SPARK-8470, this results in a MissingRequirementError because
     // the HiveContext code mistakenly overrides the class loader that contains user classes.
     // For more detail, see sql/hive/src/test/resources/regression-test-SPARK-8489/*scala.
-    val testJar = "sql/hive/src/test/resources/regression-test-SPARK-8489/test.jar"
+    import Properties.versionString
+    val version = versionString.substring(versionString.indexOf(" ") + 1,
--- End diff ---
It might be better to explicitly match 2.10 and 2.11, and throw an exception for anything else. I think the error message would then be more obvious when we introduce support for 2.12.
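
A minimal sketch of what that suggestion could look like, assuming Properties refers to scala.util.Properties and that the test jar is named by Scala binary version (the exact exception type, message, and test-$version.jar naming below are illustrative, not the code actually merged):

    import scala.util.Properties

    // Map the running Scala version to the binary version used in the test jar name.
    // Anything other than 2.10 / 2.11 fails fast with an explicit message, so a future
    // 2.12 build surfaces an obvious error instead of a silent substring mismatch.
    val version = Properties.versionNumberString match {
      case v if v.startsWith("2.10") || v.startsWith("2.11") => v.substring(0, 4)
      case other => throw new UnsupportedOperationException(
        s"Unsupported Scala version for regression-test-SPARK-8489: $other")
    }
    val testJar = s"sql/hive/src/test/resources/regression-test-SPARK-8489/test-$version.jar"

Matching on versionNumberString (e.g. "2.11.8") rather than parsing versionString by hand avoids the substring arithmetic in the diff above and keeps the supported versions explicit in one place.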