Re: [Spark on YARN] Multiple Auxiliary Shuffle Service Versions

2015-10-06 Thread Andreas Fritzler
On Mon, Oct 5, 2015 at 11:06 AM, Andreas Fritzler <andreas.fritz...@gmail.com> wrote: Hi Steve, Alex, how do you handle the distri

Re: [Spark on YARN] Multiple Auxiliary Shuffle Service Versions

2015-10-05 Thread Andreas Fritzler
Hi Steve, Alex, how do you handle the distribution and configuration of the spark-*-yarn-shuffle.jar on your NodeManagers if you want to use 2 different Spark versions? Regards, Andreas. On Mon, Oct 5, 2015 at 4:54 PM, Steve Loughran wrote: On 5 Oct 2015, at 16:48, Alex Rovner wrote:
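One conceivable answer to the question above, sketched here and unverified, is to register two differently named aux services in yarn-site.xml, one per Spark version. The service names spark_shuffle and spark2_shuffle are illustrative only, and whether both shuffle jars can coexist on a single NodeManager classpath is exactly the open question in this thread:

```xml
<!-- hypothetical sketch: two aux-service names, one per Spark version -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>spark_shuffle,spark2_shuffle</value>
</property>
<!-- each name would need to resolve to the YarnShuffleService
     shipped in that version's spark-*-yarn-shuffle.jar -->
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark2_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```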

[Spark on YARN] Multiple Auxiliary Shuffle Service Versions

2015-10-05 Thread Andreas Fritzler
Hi, I was just wondering if it is possible to register multiple versions of the aux-services with YARN, as described in the documentation: 1. In the yarn-site.xml on each node, add spark_shuffle to yarn.nodemanager.aux-services, then set yarn.nodemanager.aux-services.spark_shuffle.class
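The registration step quoted from the documentation would look roughly like this in yarn-site.xml; org.apache.spark.network.yarn.YarnShuffleService is the class the Spark-on-YARN docs name for the external shuffle service (a single-version sketch, not a verified config):

```xml
<!-- yarn-site.xml: register Spark's external shuffle service as a YARN aux service -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

The spark-*-yarn-shuffle.jar must also be on each NodeManager's classpath before the NodeManager is restarted.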

Re: Programmatically create SparkContext on YARN

2015-08-19 Thread Andreas Fritzler
--name "My app name" --jars lib1.jar,lib2.jar --deploy-mode cluster app.jar. Both YARN and standalone modes support client and cluster modes, and the spark-submit script is the common interface through which you can launch your application. In ot
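Reassembled from the fragment above, the spark-submit invocation being described would look something like the following; the main class and jar names are placeholders, not values from the thread:

```shell
# launch on YARN in cluster mode; com.example.MyApp, app.jar and the
# --jars list are placeholders
./bin/spark-submit \
  --class com.example.MyApp \
  --master yarn \
  --deploy-mode cluster \
  --name "My app name" \
  --jars lib1.jar,lib2.jar \
  app.jar
```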

Programmatically create SparkContext on YARN

2015-08-17 Thread Andreas Fritzler
Hi all, when running the Spark cluster in standalone mode I am able to create the Spark context from Java via the following code snippet: SparkConf conf = new SparkConf().setAppName("MySparkApp").setMaster("spark://SPARK_MASTER:7077").setJars(jars); JavaSparkContext sc = new JavaSparkContext(conf);
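Cleaned up, the snippet becomes the following self-contained sketch. Swapping the standalone master URL for the Spark 1.x master string "yarn-client" is the obvious programmatic route toward what the thread asks, but it additionally requires the Hadoop/YARN configuration (HADOOP_CONF_DIR or YARN_CONF_DIR) to be visible to the driver; treat this as a sketch against the Spark 1.x Java API, not a verified setup:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class YarnContextSketch {
    public static void main(String[] args) {
        // jars the executors need; these paths are placeholders
        String[] jars = {"lib1.jar", "lib2.jar"};

        SparkConf conf = new SparkConf()
            .setAppName("MySparkApp")
            // standalone: "spark://SPARK_MASTER:7077";
            // YARN client mode in Spark 1.x: "yarn-client"
            .setMaster("yarn-client")
            .setJars(jars);

        // needs yarn-site.xml etc. on the classpath to locate the ResourceManager
        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.stop();
    }
}
```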