viirya commented on a change in pull request #24070: [SPARK-23961][PYTHON] Fix
error when toLocalIterator goes out of scope
URL: https://github.com/apache/spark/pull/24070#discussion_r273510039
##########
File path: core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
##########
@@ -168,7 +168,42 @@ private[spark] object PythonRDD extends Logging {
   }
 
   def toLocalIteratorAndServe[T](rdd: RDD[T]): Array[Any] = {
 -    serveIterator(rdd.toLocalIterator, s"serve toLocalIterator")
Review comment:
Once the local iterator on the Python side goes out of scope and the iterator
is therefore not fully consumed, will it block the write call on the Scala
side? It seems to me that it will, so we shouldn't see unneeded jobs being
triggered after that, right?
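
To make the blocking question concrete, here is a minimal sketch of how I picture the serving loop (illustrative only; the `serve` helper and its signature are hypothetical, not the actual `serveIterator` internals):

```scala
import java.io.DataOutputStream
import java.net.ServerSocket

// Illustrative sketch only -- not the actual PythonRDD/serveIterator code.
object ServeLocalIteratorSketch {

  // Serve a lazily evaluated iterator (e.g. rdd.toLocalIterator) over a
  // local socket to a single reader, writing one element at a time.
  def serve[T](iter: Iterator[T])(write: (T, DataOutputStream) => Unit): Unit = {
    val server = new ServerSocket(0)      // ephemeral local port
    val socket = server.accept()          // the Python side connects and reads
    val out = new DataOutputStream(socket.getOutputStream)
    try {
      // rdd.toLocalIterator only submits the job for the next partition when
      // the iterator is advanced into it. If the reader stops consuming (the
      // Python iterator goes out of scope) and the socket send buffer fills,
      // `write` blocks here, the iterator is never advanced further, and no
      // additional jobs are triggered.
      iter.foreach(elem => write(elem, out))
      out.flush()
    } finally {
      out.close()
      socket.close()
      server.close()
    }
  }
}
```

If that reading is right, a Python reader that silently stops consuming would leave the serving thread blocked on the write rather than kicking off jobs for the remaining partitions.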