Have a look at dynamic resource allocation, described here: https://spark.apache.org/docs/latest/job-scheduling.html
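The usual cause of one application waiting is that the first submission grabs all of the queue's resources. With dynamic allocation enabled (which on YARN also requires the external shuffle service on the NodeManagers), idle executors are released so a second application can get containers. A minimal sketch, assuming the shuffle service is already set up; the jar name and executor counts are placeholders:

```shell
# Cap this app's executors so it cannot monopolize the YARN queue,
# and let Spark scale them down when they sit idle.
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=4 \
  your-app.jar
```

Alternatively, you can statically cap each submission with --num-executors so two applications fit in the cluster at once.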
Thanks
Best Regards

On Thu, Oct 22, 2015 at 11:50 PM, Suman Somasundar <suman.somasun...@oracle.com> wrote:

> Hi all,
>
> Is there a way to run 2 Spark applications in parallel under YARN in the
> same cluster?
>
> Currently, if I submit 2 applications, one of them waits till the other
> one is completed.
>
> I want both of them to start and run at the same time.
>
> Thanks,
> Suman.