Unless it's a broadcast variable, a new copy will be deserialized for every
task. To reuse expensive state across tasks on the same executor, keep it in
a lazily initialized static (per-JVM) singleton rather than in the serialized
function itself.
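A minimal sketch of that per-JVM singleton pattern (class names here, such as `SetupHolder` and `HeavyResource`, are illustrative placeholders, not Spark APIs): each deserialized task calls `SetupHolder.get()`, but the costly setup runs only once per executor JVM, via the standard lazy holder idiom.

```java
import java.util.concurrent.atomic.AtomicInteger;

public class SetupHolder {
    // Counts how many times the expensive setup actually ran.
    static final AtomicInteger initCount = new AtomicInteger();

    // Stand-in for an object with a very large setup cost.
    static class HeavyResource {
        HeavyResource() {
            initCount.incrementAndGet(); // the "intensive setup" happens here
        }
        int apply(int x) { return x * 2; }
    }

    // Holder idiom: the JVM guarantees HeavyResource is created lazily,
    // exactly once per classloader, the first time Holder is referenced.
    private static class Holder {
        static final HeavyResource INSTANCE = new HeavyResource();
    }

    static HeavyResource get() { return Holder.INSTANCE; }

    public static void main(String[] args) {
        // Simulate three tasks deserialized into the same executor JVM;
        // each one fetches the shared resource instead of carrying its own.
        int sum = 0;
        for (int task = 0; task < 3; task++) {
            sum += get().apply(task);
        }
        System.out.println(sum);             // 0*2 + 1*2 + 2*2 = 6
        System.out.println(initCount.get()); // setup ran once, not three times
    }
}
```

In a real job the function shipped to executors would call `SetupHolder.get()` inside its `call(...)` method; the static field never travels with the task, so the setup cost is paid once per executor rather than once per task.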

On Wed, Oct 14, 2015 at 10:18 AM, Starch, Michael D (398M) <
michael.d.sta...@jpl.nasa.gov> wrote:

> All,
>
> Is a Function object in Spark reused on a given executor, or is it sent and
> deserialized with each new task?
>
> On my project, we have functions that incur a very large setup cost, but
> then could be called many times.  Currently, I am using object
> deserialization to run this intensive setup, and I am wondering whether
> this function is reused (within the context of the executor), or whether I
> am deserializing this object over and over again for each task sent to a
> given worker.
>
> Are there other ways to share objects between tasks on the same executor?
>
> Many thanks,
>
> Michael
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
