Github user ilganeli commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4022#discussion_r23095446
  
    --- Diff: docs/programming-guide.md ---
    @@ -1316,7 +1316,35 @@ For accumulator updates performed inside <b>actions only</b>, Spark guarantees that each task's update to the accumulator
     will only be applied once, i.e. restarted tasks will not update the value. In transformations, users should be aware
     that each task's update may be applied more than once if tasks or job stages are re-executed.
     
    +In addition, accumulators do not maintain lineage for the operations that use them. Consequently, accumulator updates are not guaranteed to be executed when made within a lazy transformation like `map()`. Unless something triggers evaluation of the lazy transformation that updates the accumulator's value, subsequent operations will not themselves trigger that evaluation, and the value of the accumulator will remain unchanged. The code fragment below demonstrates this issue:
    --- End diff ---
    
    Thanks for the suggestion - I've updated the doc.
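
    For readers following along, the pitfall described in the quoted paragraph can be sketched roughly as follows. This is not the exact fragment added in the PR, just an illustrative Scala sketch using the pre-2.0 `sc.accumulator` API that the programming guide used at the time; the object and app names are made up:

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical demo object, not from the PR itself.
    object AccumulatorLaziness {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("accum-demo").setMaster("local[*]"))

        val accum = sc.accumulator(0)
        val data  = sc.parallelize(1 to 10)

        // map() is lazy: no task has run yet, so the update below
        // has not actually been executed at this point.
        val mapped = data.map { x => accum += x; x * 2 }
        println(accum.value)  // still 0 -- nothing has triggered evaluation

        // An action forces evaluation; only now do the updates happen.
        mapped.count()
        println(accum.value)  // 55 = 1 + 2 + ... + 10

        sc.stop()
      }
    }
    ```

    The fix in the doc text is exactly this: wrap the accumulator update in (or follow it with) an action such as `count()` or `foreach()`, rather than relying on a bare transformation.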



---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
