Re: Sparklyr and idle executors

2018-03-16 Thread Florian Dewes
I set this from within R:

    config <- spark_config()
    config$spark.shuffle.service.enabled = "true"
    config$spark.dynamicAllocation.enabled = "true"
    config$spark.dynamicAllocation.executorIdleTimeout = 120
    config$spark.dynamicAllocation.maxExecutors = 80
    sc <- spark_connect(master = "yarn_client",
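One thing worth noting here: on YARN, setting spark.shuffle.service.enabled from the client side is not sufficient by itself. Dynamic allocation also requires the external shuffle service to be running on every NodeManager, which is a cluster-side change. A sketch of the yarn-site.xml entries described in Spark's dynamic allocation documentation (this assumes the spark-<version>-yarn-shuffle.jar has been placed on the NodeManager classpath):

```xml
<!-- Register Spark's external shuffle service as a NodeManager aux service -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

All NodeManagers need a restart after this change; without the shuffle service, executors cannot be released safely because their shuffle output would be lost, so idle executors may linger.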

Re: Sparklyr and idle executors

2018-03-16 Thread Femi Anthony
I assume you're setting these values in spark-defaults.conf. What happens if you specify them directly to spark-submit, as in --conf spark.dynamicAllocation.enabled=true ?

On Thu, Mar 15, 2018 at 1:47 PM, Florian Dewes wrote:
> Hi all,
>
> I am currently trying to enable
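Spelling out Femi's suggestion, passing the same settings directly on the spark-submit command line might look like the sketch below (the application file name is a placeholder; the property names and values mirror the R config from the original message):

```shell
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.executorIdleTimeout=120s \
  --conf spark.dynamicAllocation.maxExecutors=80 \
  my_app.R
```

Settings passed with --conf take precedence over spark-defaults.conf, which makes this a quick way to check whether the values are actually reaching the application.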

Sparklyr and idle executors

2018-03-15 Thread Florian Dewes
Hi all, I am currently trying to enable dynamic resource allocation for a small YARN-managed Spark cluster. We are using sparklyr to access Spark from R and have multiple jobs that should run in parallel, because some of them take several days to complete or are in development. Everything