Re: Spark job uses only one Worker

2016-01-08 Thread Michael Pisula
> …your master in the Hue.
>
> Thanks
>
> On Thursday, January 7, 2016 5:03 PM, Michael Pisula <michael.pis...@tngtech.com> wrote:
>> I had tried several parameters, including --total-executor-cores, no effect.
>> As for the port, I tried 7077, but…
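
In standalone mode, --total-executor-cores maps to the spark.cores.max property, so the same cap can also be set in code instead of on the command line. A minimal Scala sketch of that idea; the app name is taken from the class used in this thread, everything else is illustrative, not from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    // Cap the total cores this application may claim across the standalone
    // cluster. spark.cores.max is the property that --total-executor-cores sets.
    val conf = new SparkConf()
      .setAppName("StaticDataAnalysis")
      .set("spark.cores.max", "4")
    val sc = new SparkContext(conf)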

Re: Spark job uses only one Worker

2016-01-07 Thread Michael Pisula
…Igor Berman wrote:
> share how you submit your job
> what cluster (yarn, standalone)
>
> On 7 January 2016 at 23:24, Michael Pisula <michael.pis...@tngtech.com> wrote:
>> Hi there,
>>
>> I ran a simple Batch…

Re: Spark job uses only one Worker

2016-01-07 Thread Michael Pisula
…spark/bin/spark-submit --class demo.spark.StaticDataAnalysis --master spark://:6066 --deploy-mode cluster demo/Demo-1.0-SNAPSHOT-all.jar

Cheers,
Michael

On 07.01.2016 22:41, Igor Berman wrote:
> share how you submit your job…
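
For context: on a standalone master, 6066 is the REST submission endpoint used with cluster deploy mode, while 7077 is the legacy submission port. A hedged variant of the command above with the --total-executor-cores flag suggested later in the thread; the master host is a placeholder, since the archive masks it:

    ./spark/bin/spark-submit \
      --class demo.spark.StaticDataAnalysis \
      --master spark://<master-host>:7077 \
      --deploy-mode cluster \
      --total-executor-cores 4 \
      demo/Demo-1.0-SNAPSHOT-all.jar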

Re: Spark job uses only one Worker

2016-01-07 Thread Michael Pisula
…ng port, it's very strange; please post what the problem is connecting to 7077.

use *--total-executor-cores 4* in your submit

if you can, post the master UI screen after you submitted your app

On 8 January 2016 at 00:02, Michael Pisula <michael.pis...@tngtech.com> wrote:…

Spark job uses only one Worker

2016-01-07 Thread Michael Pisula
Hi there,

I ran a simple Batch Application on a Spark Cluster on EC2. Despite having 3 Worker Nodes, I could not get the application processed on more than one node, regardless of whether I submitted the Application in Cluster or Client mode. I also tried manually increasing the number of partitions in…
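
Raising the partition count is typically done on the input RDD or via repartition. A minimal Scala sketch of that idea, assuming a text-file input; the path and the count of 12 are hypothetical, not from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("StaticDataAnalysis"))
    // Request more input splits than a single node's cores provide, so the
    // scheduler can spread tasks across all three workers.
    val lines = sc.textFile("hdfs:///data/input", minPartitions = 12)
    // Alternatively, redistribute an existing RDD before the heavy stage:
    val spread = lines.repartition(12)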