Hi Jamborta,

You can set the following options in your application to limit its resource usage:
- spark.cores.max
- spark.executor.memory

It's better to use Mesos if you want to run multiple applications on the same cluster smoothly. (A minimal example of setting these options follows the quoted message below.)

Thanks
Best Regards

On Thu, Jun 26, 2014 at 5:37 PM, jamborta <[email protected]> wrote:
> Hi all,
>
> Not sure if this is a config issue or it's by design, but when I run the
> Spark shell and try to submit another application from elsewhere, the
> second application waits for the first to finish and outputs the following:
>
> Initial job has not accepted any resources; check your cluster UI to ensure
> that workers are registered and have sufficient memory.
>
> I have four workers, each with spare resources to take on the new
> application.
>
> thanks,
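For example, here is a minimal sketch of capping an application so the standalone scheduler leaves room for others (the app name, master URL, and values are assumptions; adjust them to your cluster):

import org.apache.spark.{SparkConf, SparkContext}

// Cap the total cores and per-executor memory this application claims,
// leaving the remaining workers free for a second application.
val conf = new SparkConf()
  .setAppName("capped-app")             // hypothetical app name
  .setMaster("spark://master:7077")     // assumed standalone master URL
  .set("spark.cores.max", "4")          // max cores claimed across the whole cluster
  .set("spark.executor.memory", "2g")   // memory per executor
val sc = new SparkContext(conf)

The same caps can also go in conf/spark-defaults.conf (e.g. a line like "spark.cores.max 4") so the spark-shell picks them up without code changes.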
