William Benton created SPARK-1454:
-------------------------------------

             Summary: PySpark accumulators fail to update when runJob takes serialized/captured closures
                 Key: SPARK-1454
                 URL: https://issues.apache.org/jira/browse/SPARK-1454
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Core
    Affects Versions: 1.0.0
            Reporter: William Benton
            Priority: Minor


My patch for SPARK-729 optionally serializes closures when they are cleaned (in 
order to capture the values of mutable free variables at declaration time 
rather than at execution time). This behavior is currently disabled for the 
closure argument to SparkContext.runJob, because enabling it there causes 
Python accumulators to fail to update.
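For context, here is a minimal plain-Python sketch (not using Spark APIs) of the free-variable semantics the SPARK-729 change targets: an ordinary closure resolves a mutable free variable at execution time, whereas a closure whose captured state is snapshotted at declaration time sees the value the variable had when it was declared. The names below (`make_closure`, `captured_factor`) are illustrative only.

```python
import copy

factor = 2

def make_closure():
    # 'factor' is a free variable; Python looks it up at call (execution) time.
    return lambda x: x * factor

late = make_closure()

# Simulate declaration-time capture (what serializing the cleaned closure
# achieves) by copying the free variable's value now.
captured_factor = copy.copy(factor)
early = lambda x: x * captured_factor

factor = 10  # mutate the free variable after both closures were declared

print(late(3))   # 30: sees the execution-time value of 'factor'
print(early(3))  # 6: sees the declaration-time value
```

The accumulator failure described above arises because the runJob closure path relies on execution-time state in a way that breaks under this declaration-time snapshotting.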

The purpose of this JIRA is to track this issue and fix whatever is causing 
Python accumulators to break under it, so that closures passed to runJob can be 
serialized and captured in general.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
