Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19848#discussion_r154158671
--- Diff: core/src/test/scala/org/apache/spark/rdd/PairRDDFunctionsSuite.scala ---
@@ -524,6 +525,13 @@ class PairRDDFunctionsSuite extends SparkFunSuite with SharedSparkContext {
     pairs.saveAsNewAPIHadoopFile[ConfigTestFormat]("ignored")
   }

+  test("The JobId on driver and executor should be the same during the commit") {
+    // Create more than one rdd to mimic stageId not equal to rddId
+    val pairs = sc.parallelize(Array((1, 2), (2, 3)), 2)
+      .map { p => (new Integer(p._1 + 1), new Integer(p._2 + 1)) }
+      .filter { p => p._1 > 0 }
+    pairs.saveAsNewAPIHadoopFile[YetAnotherFakeFormat]("ignored")
+  }
--- End diff ---
Add `assert(JobID.jobid != -1)` to make sure the test code is actually running.
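
A sketch of how the suggested assertion could be folded into the test, assuming `JobID.jobid` is a test-only holder (as referenced in the comment above, not shown in this diff) that the fake output format writes the observed job id into, with `-1` as its initial value:

```scala
test("The JobId on driver and executor should be the same during the commit") {
  // Create more than one rdd to mimic stageId not equal to rddId
  val pairs = sc.parallelize(Array((1, 2), (2, 3)), 2)
    .map { p => (new Integer(p._1 + 1), new Integer(p._2 + 1)) }
    .filter { p => p._1 > 0 }
  pairs.saveAsNewAPIHadoopFile[YetAnotherFakeFormat]("ignored")
  // If the commit path never ran, the holder still has its initial -1,
  // so this guards against the test silently passing without exercising it.
  assert(JobID.jobid != -1)
}
```

Without such a guard, a save that short-circuits before commit would leave the test green even though nothing was verified.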
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]