Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/19369#discussion_r141662253
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockReplicationPolicy.scala ---
@@ -85,11 +65,9 @@ object BlockReplicationUtils {
* randomly shuffle elems
*/
def getRandomSample[T](elems: Seq[T], m: Int, r: Random): List[T] = {
- if (elems.size > m) {
- getSampleIds(elems.size, m, r).map(elems(_))
- } else {
- r.shuffle(elems).toList
- }
+ // This takes linear space, but is stable wrt m. That is for a fixed
--- End diff ---
Hm, but I presume the purpose of the specialized method is to handle the
case where `elems` is large. This isn't just test code, so this might have a
significant impact. Can the existing approach not be modified to be stable, or
the tests otherwise relaxed?
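For context on the tradeoff being discussed: a `getSampleIds`-style helper can pick m distinct indices out of n in O(m) time and space without shuffling the whole sequence, e.g. via Floyd's sampling algorithm. The sketch below is a hypothetical illustration of that technique, not the actual Spark implementation; note that it is *not* stable with respect to m (growing m for a fixed seed changes which indices are drawn), which is exactly the property the replacement code in this diff is trying to gain.

```scala
import scala.util.Random
import scala.collection.mutable

// Floyd's algorithm: uniformly sample m distinct indices from [0, n)
// in O(m) time and O(m) extra space. Hypothetical sketch only.
def sampleIndices(n: Int, m: Int, r: Random): List[Int] = {
  require(m <= n)
  // LinkedHashSet preserves insertion order while giving O(1) membership tests.
  val selected = mutable.LinkedHashSet.empty[Int]
  for (j <- (n - m) until n) {
    val t = r.nextInt(j + 1)
    // If t was already chosen, j cannot have been (j > all prior candidates),
    // so inserting j keeps the sample uniform over all m-subsets.
    if (selected.contains(t)) selected += j else selected += t
  }
  selected.toList
}
```

Because each draw `r.nextInt(j + 1)` depends on m through the loop bounds, calling this with the same seed but a larger m produces an unrelated index set, which is why a fixed-seed test asserting on a prefix would break.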
---