Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/19848#discussion_r157111981
--- Diff: core/src/main/scala/org/apache/spark/internal/io/SparkHadoopWriter.scala ---
@@ -60,17 +60,17 @@ object SparkHadoopWriter extends Logging {
       config: HadoopWriteConfigUtil[K, V]): Unit = {
     // Extract context and configuration from RDD.
     val sparkContext = rdd.context
-    val stageId = rdd.id
+    val commitJobId = rdd.id
 
     // Set up a job.
     val jobTrackerId = createJobTrackerID(new Date())
--- End diff ---
`jobTrackerId` is also not unique; is that OK?
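
For context, `jobTrackerId` is built from `createJobTrackerID(new Date())`, which only has wall-clock precision, so two jobs submitted close together can end up with the same tracker ID. A minimal sketch of the collision, assuming the same `"yyyyMMddHHmmss"` timestamp format used by `SparkHadoopWriterUtils.createJobTrackerID` (the object name and format here are assumptions, not a quote of the Spark source):

```scala
import java.text.SimpleDateFormat
import java.util.{Date, Locale}

object JobTrackerIdCollisionSketch {
  // Assumed mirror of SparkHadoopWriterUtils.createJobTrackerID:
  // format the submission time with second-level precision.
  def createJobTrackerID(time: Date): String =
    new SimpleDateFormat("yyyyMMddHHmmss", Locale.US).format(time)

  def main(args: Array[String]): Unit = {
    val now = new Date()
    val idA = createJobTrackerID(now)
    // A second job started 500 ms later, within the same wall-clock second,
    // gets an identical jobTrackerId.
    val idB = createJobTrackerID(new Date(now.getTime + 500L))
    println(s"idA = $idA, idB = $idB, collision = ${idA == idB}")
  }
}
```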