Re: Elastic allocation (spark.dynamicAllocation.enabled) results in tasks never being executed.

2015-01-04 Thread Tsuyoshi Ozawa
Please check the document added by Andrew. I could run tasks with Spark 1.2.0.
* https://github.com/apache/spark/pull/3731/files#diff-c3cbe4cabe90562520f22d2306aa9116R86
* https://github.com/apache/spark/pull/3757/files#diff-c3cbe4cabe90562520f22d2306aa9116R101
Thanks,
- Tsuyoshi
On Sun, Jan 4 …
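
[Editor's note: a minimal sketch of the driver-side configuration those docs describe, assuming Spark 1.2.0 property names; the settings in the linked diffs are authoritative.]

    import org.apache.spark.{SparkConf, SparkContext}

    // Enable dynamic executor allocation; on YARN this also requires the
    // external shuffle service so shuffle files survive executor removal.
    val conf = new SparkConf()
      .setAppName("dynamic-allocation-sketch")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
    val sc = new SparkContext(conf)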

Re: Dynamic Allocation in Spark 1.2.0

2014-12-27 Thread Tsuyoshi OZAWA
Hi Anders, I faced the same issue you mentioned. Yes, you need to install the Spark shuffle plugin for YARN. Please check the following PRs, which add documentation on enabling dynamicAllocation:
* https://github.com/apache/spark/pull/3731
* https://github.com/apache/spark/pull/3757
I could run Spark on YARN with dynamicAllocation …
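
[Editor's note: as those PRs document, the NodeManager-side setup amounts roughly to placing the spark-<version>-yarn-shuffle.jar on each NodeManager's classpath, adding spark_shuffle to yarn.nodemanager.aux-services, setting yarn.nodemanager.aux-services.spark_shuffle.class to org.apache.spark.network.yarn.YarnShuffleService, and restarting the NodeManagers. A hedged smoke test in Scala, with illustrative app and property values:]

    import org.apache.spark.{SparkConf, SparkContext}

    // Assuming the YARN shuffle service above is installed, this job forces a
    // shuffle (reduceByKey), which exercises the external shuffle service.
    val sc = new SparkContext(new SparkConf()
      .setAppName("shuffle-service-smoke-test")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true"))
    val counts = sc.parallelize(1 to 1000000, 200)
      .map(i => (i % 10, 1L))
      .reduceByKey(_ + _)
      .collect()
    // Without the YARN plugin, executors cannot register with the shuffle
    // service and tasks may never be scheduled (the symptom in the first thread).
    sc.stop()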

Re: resource allocation spark on yarn

2014-12-12 Thread Tsuyoshi OZAWA
Hi,
In addition to the options Sameer mentioned, we need to enable the external shuffle manager, right?
Thanks,
- Tsuyoshi
On Sat, Dec 13, 2014 at 5:27 AM, Sameer Farooqui wrote:
> Hi,
>
> FYI - There are no Worker JVMs used when Spark is launched under YARN.
> Instead the NodeManager in YARN does …
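
[Editor's note: a sketch tying the threads together: per-executor resource settings on YARN plus the external shuffle manager Tsuyoshi raises. Property names are real Spark 1.2 settings; the values are illustrative, not a recommendation.]

    import org.apache.spark.{SparkConf, SparkContext}

    // On YARN there are no Worker JVMs; NodeManagers launch executor containers,
    // so per-executor resources and allocation bounds are set through SparkConf.
    val conf = new SparkConf()
      .setAppName("yarn-resource-allocation-sketch")
      .set("spark.executor.memory", "4g")            // per-executor heap (illustrative)
      .set("spark.executor.cores", "2")              // per-executor cores (illustrative)
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "2")
      .set("spark.dynamicAllocation.maxExecutors", "20")
      .set("spark.shuffle.service.enabled", "true")  // the external shuffle manager
    val sc = new SparkContext(conf)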