Re: Dynamically change executors settings

2016-08-26 Thread linguin.m.s
Hi,

No, currently you can't change executor settings on a running application. 

// maropu
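
For context, executor settings such as spark.executor.cores are read once, when the SparkContext is created, and stay fixed for the lifetime of the application. A minimal sketch of where they get pinned (the app name and values here are made up):

    import org.apache.spark.{SparkConf, SparkContext}

    // Executor settings are read here, at context creation, and cannot be
    // changed afterwards through any supported API.
    val conf = new SparkConf()
      .setAppName("skewed-files-job")       // hypothetical app name
      .set("spark.executor.cores", "2")     // cores per executor, fixed from now on
      .set("spark.executor.memory", "8g")   // heap per executor, likewise fixed
    val sc = new SparkContext(conf)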



On 2016/08/27 at 11:40, Vadim Semenov wrote:

> Hi spark users,
> 
> I wonder if it's possible to change executor settings on the fly.
> My use case: I have a lot of non-splittable, skewed files in a custom 
> format that I read using a custom Hadoop RecordReader. These files range 
> from small to huge, and I'd like to use only one or two cores per executor 
> while they are processed (so each task can use the whole heap), but once 
> they are processed I'd like to enable all cores.
> I know I can achieve this by splitting the work into two separate jobs, 
> but I wonder if there is a way to get this behavior within a single job.
> 
> Thanks!
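
For reference, a minimal sketch of the two-job split mentioned above: stop the first context after the heavy pass and start a second one with different settings. The paths, app names, and the textFile stand-in for the custom RecordReader input are all hypothetical, and whether the second context actually receives executors with the new settings depends on the cluster manager:

    import org.apache.spark.{SparkConf, SparkContext}

    // Pass 1: one core per executor, so each task can use the whole heap
    // while the large non-splittable files are parsed.
    val heavyConf = new SparkConf()
      .setAppName("parse-skewed-files")
      .set("spark.executor.cores", "1")
    val sc1 = new SparkContext(heavyConf)
    sc1.textFile("hdfs:///data/raw")              // stand-in for the custom hadoopFile call
       .saveAsObjectFile("hdfs:///tmp/parsed")    // hand off to the second pass
    sc1.stop()

    // Pass 2: a fresh context with all cores enabled for the rest
    // of the pipeline.
    val fullConf = new SparkConf()
      .setAppName("process-parsed")
      .set("spark.executor.cores", "8")
    val sc2 = new SparkContext(fullConf)
    sc2.objectFile[String]("hdfs:///tmp/parsed")
       .count()                                   // placeholder for downstream work
    sc2.stop()

The safer variant of the same idea is two separate spark-submit runs with different --executor-cores values, chained on the intermediate output.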



