Github user JoshRosen commented on a diff in the pull request:
https://github.com/apache/spark/pull/21383#discussion_r189818845
--- Diff: python/pyspark/shuffle.py ---
@@ -67,6 +67,19 @@ def get_used_memory():
         return 0
+def safe_iter(f):
+    """ wraps f to make it safe (= does not lead to data loss) to use inside a for loop
--- End diff ---
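
The quoted hunk only shows the opening of the wrapper, so to make the intent concrete, here is a rough sketch of what such a guard could look like (the name `safe_iter` comes from the diff; re-raising as `RuntimeError` is an assumption on my part, not necessarily what this patch actually does):

```python
import functools


def safe_iter(f):
    """Sketch only: wrap f so that a StopIteration raised inside it cannot be
    mistaken for normal end-of-iteration by the loop or generator driving it."""
    @functools.wraps(f)
    def wrapper(*args, **kwargs):
        try:
            return f(*args, **kwargs)
        except StopIteration as exc:
            # Assumed behavior: surface the error loudly instead of letting the
            # enclosing iteration machinery swallow it and silently drop data.
            raise RuntimeError("StopIteration raised inside user function: %s" % exc)
    return wrapper
```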
It sounds like this is a potential correctness issue, so the eventual fix
should be backported to maintenance releases (at least the most recent
branches, so that it lands in the next 2.3.x release).
I saw the examples provided on the linked JIRAs, but do you have an example
of a realistic user workload where this problem can occur (i.e. a case where
the problem is more subtle than explicitly throwing `StopIteration()`)? Would
that be something like calling `next()` past the end of an iterator (which I
suppose could occur deep in library code)?
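
To make the failure mode concrete, here is a hypothetical illustration (not taken from the linked JIRAs) of that `next()`-past-the-end pattern: a per-record function pulls from a side iterator, raises `StopIteration` once that iterator runs dry, and the surrounding iteration machinery treats it as a normal end of data rather than an error:

```python
lookup = iter(["a", "b", "c"])

def attach_lookup(record):
    # next() raises StopIteration on the fourth call because the side
    # iterator is exhausted...
    return (record, next(lookup))

# ...and map()/list() interpret that as end-of-input, so the result is
# silently truncated to 3 records instead of failing.
print(list(map(attach_lookup, range(5))))
# [(0, 'a'), (1, 'b'), (2, 'c')]
```

Presumably the same pattern inside a function passed to PySpark, where records are driven through internal loops and generators, is where records could be silently dropped.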
---