GitHub user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/19763#discussion_r152020601
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -472,16 +475,45 @@ private[spark] class MapOutputTrackerMaster(
     shuffleStatuses.get(shuffleId).map(_.findMissingPartitions())
   }

+  /**
+   * Try to equally divide Range(0, num) to divisor slices
+   */
+  def equallyDivide(num: Int, divisor: Int): Iterator[Seq[Int]] = {
+    assert(divisor > 0, "Divisor should be positive")
+    val (each, remain) = (num / divisor, num % divisor)
+    val (smaller, bigger) = (0 until num).splitAt((divisor-remain) * each)
--- End diff ---
Can you add a comment describing the algorithm? I'd expect something
like:
```
to equally divide n elements into m buckets:
each bucket should have n/m elements;
for the remaining n%m elements,
pick the first n%m buckets and add one more element to each
```
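
For reference, here is a self-contained sketch of what the documented method could look like. The diff above stops at the `splitAt` line, so the `grouped`-based completion, the wrapping `EquallyDivideExample` object and the `main` demo below are illustrative assumptions, not necessarily the exact code in the PR:

```
object EquallyDivideExample {

  /**
   * Equally divide Range(0, num) into `divisor` slices.
   *
   * Algorithm: to equally divide n elements into m buckets, each bucket
   * gets n/m elements; the remaining n%m elements are handed out one per
   * bucket to n%m of the buckets. With this splitAt, the first
   * (m - n%m) * (n/m) indices form the evenly sized slices and the rest
   * form the slices that carry one extra element.
   */
  def equallyDivide(num: Int, divisor: Int): Iterator[Seq[Int]] = {
    assert(divisor > 0, "Divisor should be positive")
    val (each, remain) = (num / divisor, num % divisor)
    val (smaller, bigger) = (0 until num).splitAt((divisor - remain) * each)
    if (each != 0) {
      // grouped(each) would throw for each == 0, hence the guard
      smaller.grouped(each) ++ bigger.grouped(each + 1)
    } else {
      // num < divisor: only num singleton slices are produced
      bigger.grouped(1)
    }
  }

  def main(args: Array[String]): Unit = {
    // 10 elements into 3 slices -> sizes 3, 3, 4
    equallyDivide(10, 3).map(_.mkString("[", ",", "]")).foreach(println)
    // 2 elements into 4 slices -> two singleton slices
    equallyDivide(2, 4).map(_.mkString("[", ",", "]")).foreach(println)
  }
}
```

One detail worth noting: with `splitAt((divisor - remain) * each)` it is the trailing slices that receive the extra element, so a comment that matches the code exactly would say "the last n%m buckets" rather than "the first".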