srowen commented on a change in pull request #24521: [SPARK-27629][PySpark]
Prevent Unpickler from intervening each unpickling
URL: https://github.com/apache/spark/pull/24521#discussion_r280930247
##########
File path: core/src/main/scala/org/apache/spark/api/python/SerDeUtil.scala
##########
@@ -186,6 +186,9 @@ private[spark] object SerDeUtil extends Logging {
     val unpickle = new Unpickler
     iter.flatMap { row =>
       val obj = unpickle.loads(row)
+      // `Opcodes.MEMOIZE` of Protocol 4 (Python 3.4+) will store objects in internal map
+      // of `Unpickler`. This map is cleared when calling `Unpickler.close()`.
+      unpickle.close()
Review comment:
I don't know much about this area, but it looks odd to close the object repeatedly, even if it doesn't cause a problem today. What's the downside of creating a new Unpickler for each row? Just the performance cost?
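
To make the question concrete, here is a minimal, standalone sketch (outside Spark, with made-up sample rows and an illustrative object name) of what "a new object for each row" would look like with Pyrolite's `Unpickler`. It only illustrates the alternative being asked about, not a proposed patch:

```scala
import net.razorvine.pickle.{Pickler, Unpickler}

object FreshUnpicklerSketch {
  def main(args: Array[String]): Unit = {
    // Stand-in input: a few pickled payloads produced locally for the example,
    // playing the role of the rows Spark would hand to this code.
    val pickler = new Pickler()
    val rows: Iterator[Array[Byte]] = Iterator("a", "b", "c").map(s => pickler.dumps(s))

    // Alternative: allocate a fresh Unpickler per row, so its memo map
    // (populated by protocol-4 MEMOIZE opcodes) never outlives one record.
    // The trade-off versus this patch is one extra allocation per row
    // instead of one close() call per row on a shared instance.
    val objs = rows.map { row =>
      val unpickle = new Unpickler
      unpickle.loads(row)
    }

    objs.foreach(println)
  }
}
```

If that per-row allocation turns out to matter, the `close()`-after-each-`loads()` approach in the diff keeps a single instance and just clears the memo, which I assume is why the patch went that way.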
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]