Hi

Is there a way in Spark to run a function on each executor just once? I have
a couple of use cases.

a) I use an external library that is a singleton. It keeps some global state
and provides functions to manipulate it (e.g. reclaim memory). I want to
check the global state of this library on each executor.

b) To get JVM stats or instrumentation on each executor.
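For (b), this is roughly what I have in mind. A rough sketch only, assuming a
SparkContext `sc` is in scope; the partition count is just a guess to try to
reach every executor, and I believe `SparkEnv.get.executorId` is how to tell
the samples apart, so executors that happen to get no task are simply missing
from the result:

import java.lang.management.ManagementFactory
import org.apache.spark.SparkEnv

// Sample heap usage on whichever executors end up running these tasks.
// Keyed by executor id so the driver can keep one sample per executor JVM.
val heapPerExecutor = sc.parallelize(1 to 1000, 100)
  .mapPartitions { _ =>
    val heap = ManagementFactory.getMemoryMXBean.getHeapMemoryUsage
    Iterator((SparkEnv.get.executorId, heap.getUsed))
  }
  .collect()
  .toMap // collapse duplicate samples from the same executor

heapPerExecutor.foreach { case (executorId, used) =>
  println(s"executor $executorId heap used: $used bytes")
}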

Currently I have a crude way of achieving something similar: I run a map over
a large, hash-partitioned RDD, but this does not guarantee that the function
runs exactly once on each executor.
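To make that crude approach a little safer, I guard the body with a JVM-level
flag so it runs at most once per executor JVM, no matter how many partitions
land there (it can still run zero times on an executor that gets no task).
Again just a sketch: `OncePerExecutor` is my own helper, not anything in
Spark, and the partition count is an over-estimate to try to cover every
executor:

import java.util.concurrent.atomic.AtomicBoolean

// A Scala object is a per-JVM singleton, so on each executor this flag is
// shared by all tasks in that JVM: the block passed to run() executes at
// most once per executor JVM.
object OncePerExecutor {
  private val ran = new AtomicBoolean(false)
  def run(body: => Unit): Unit = if (ran.compareAndSet(false, true)) body
}

// Over-partition so every executor is likely (but not guaranteed) to get a task.
sc.parallelize(1 to 10000, 200).foreachPartition { _ =>
  OncePerExecutor.run {
    // e.g. check the singleton library's global state or ask it to reclaim
    // memory here; the println is just a placeholder
    println("per-executor check ran in this JVM")
  }
}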

Deenar



