[ https://issues.apache.org/jira/browse/SPARK-8147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576220#comment-14576220 ]

Sean Owen commented on SPARK-8147:
----------------------------------

Hm, how does a wrapper manage this, though? It can't change memory allocation in 
the underlying collection. What would the semantics of the wrapper be? You can 
already wrap iterators within your own functions, so what does this add?
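
For context, the kind of wrapping already possible today is decorating each partition's iterator with mapPartitions. A minimal sketch of that pattern, assuming an element-count statistic gathered through an accumulator (the object name and the statistic are illustrative only):

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: decorating each partition's iterator via mapPartitions to
// collect a per-element statistic, without any new Spark API.
object IteratorWrapSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("iterator-wrap-sketch").setMaster("local[*]"))
    // Accumulator used to report how many elements flowed through the wrapper.
    val elementsSeen = sc.accumulator(0L, "elements seen")

    val rdd = sc.parallelize(1 to 100000, numSlices = 8)

    // mapPartitions hands us the underlying iterator; returning a wrapping
    // iterator lets us observe every element lazily as it is consumed.
    val wrapped = rdd.mapPartitions { iter =>
      iter.map { elem =>
        elementsSeen += 1L
        elem
      }
    }

    println(s"count = ${wrapped.count()}, elements seen = ${elementsSeen.value}")
    sc.stop()
  }
}
{code}

One known caveat of this approach: accumulator updates made inside transformations can be double-counted when tasks are retried.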

> Add ability to decorate RDD iterators
> -------------------------------------
>
>                 Key: SPARK-8147
>                 URL: https://issues.apache.org/jira/browse/SPARK-8147
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.3.1
>            Reporter: Yuri Makhno
>
> In Spark, all computations are done through iterators created by the 
> RDD.iterator method. It would be good if we could specify some 
> RDDIteratorDecoratorFactory in SparkConf and have it decorate all RDD 
> iterators created in the executor JVM. 
> For us this would be extremely useful because we want to control the executor's 
> memory and, instead of hitting OutOfMemory on the executor, fail the job with a 
> NotEnoughMemory reason when we see that there is not enough memory left to 
> continue. We also want to collect some computation statistics on the executor.
> I can provide a PR if this improvement is approved.
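
For illustration only, a rough sketch of what the proposed hook might look like. RDDIteratorDecoratorFactory is not an existing Spark API; every name and signature below is a guess at the shape of the proposal, not an implementation:

{code:scala}
import org.apache.spark.{Partition, TaskContext}

// Hypothetical interface for the proposed hook; NOT an existing Spark API.
// The executor would call the factory each time RDD.iterator produces an
// iterator and substitute the decorated iterator it returns.
trait RDDIteratorDecoratorFactory extends Serializable {
  def decorate[T](rddId: Int,
                  partition: Partition,
                  context: TaskContext,
                  underlying: Iterator[T]): Iterator[T]
}

// Example decorator: counts elements and leaves room for a memory check that
// would fail the task with a dedicated error instead of an OutOfMemoryError.
class StatsAndMemoryGuardFactory extends RDDIteratorDecoratorFactory {
  override def decorate[T](rddId: Int,
                           partition: Partition,
                           context: TaskContext,
                           underlying: Iterator[T]): Iterator[T] = {
    var seen = 0L
    underlying.map { elem =>
      seen += 1
      // A real version would consult executor memory here and throw a
      // NotEnoughMemory-style exception when a configured threshold is hit.
      elem
    }
  }
}
{code}

Whether such a factory could actually prevent OutOfMemory, rather than merely observe elements, is the open question raised in the comment above.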


