Sorry, I should've included some stats with my email

I execute each job in the following manner:

./bin/spark-submit \
  --class CLASSNAME \
  --master yarn-cluster \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1
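
I haven't been setting --num-executors, so (if I'm reading the docs right) each submission also asks YARN for the default number of executors on top of the driver container. A variant that pins each job's footprint explicitly would look like this (class name, jar, and arguments are just the placeholders from above):

./bin/spark-submit \
  --class CLASSNAME \
  --master yarn-cluster \
  --driver-memory 1g \
  --executor-memory 1g \
  --executor-cores 1 \
  --num-executors 1 \
  UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1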


The box has:

24 CPUs, Intel(R) Xeon(R) CPU E5-2420 v2 @ 2.20GHz
32 GB RAM
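
Based on that and your suggestion to shrink each application, I plan to try submitting both jobs together with smaller containers, something like the following (the second class name, the second consumer group, and the 512m figures are only illustrative):

./bin/spark-submit --class CLASSNAME1 --master yarn-cluster \
  --driver-memory 512m --executor-memory 512m \
  --executor-cores 1 --num-executors 1 \
  UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group1 1 &

./bin/spark-submit --class CLASSNAME2 --master yarn-cluster \
  --driver-memory 512m --executor-memory 512m \
  --executor-cores 1 --num-executors 1 \
  UBER.JAR ${ZK_PORT_2181_TCP_ADDR} my-consumer-group2 1 &

The trailing & just backgrounds each spark-submit so the second one is submitted without waiting for the first to finish.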


Thanks,

Josh

On Tue, Oct 28, 2014 at 4:15 PM, Soumya Simanta <soumya.sima...@gmail.com>
wrote:

> Try reducing the resources (cores and memory) of each application.
>
>
>
> > On Oct 28, 2014, at 7:05 PM, Josh J <joshjd...@gmail.com> wrote:
> >
> > Hi,
> >
> > How do I run multiple Spark applications in parallel? I tried to run them
> > on a YARN cluster, but the second application submitted does not run.
> >
> > Thanks,
> > Josh
>
