Did you read the section on scheduling within an application?
https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application

Thanks
Best Regards

On Thu, Oct 15, 2015 at 11:31 PM, jeff.sadow...@gmail.com <
jeff.sadow...@gmail.com> wrote:

> I am having issues trying to set up Spark to run jobs simultaneously.
>
> I thought I wanted FAIR scheduling?
>
> I used the templated fairscheduler.xml as is. When I start pyspark I see the
> three expected pools: production, test, and default.
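>
> For reference, a rough sketch of how the scheduler mode and allocation file
> can be wired up when building a context from a script; the file path and app
> name below are placeholders, not my exact setup:
>
>     from pyspark import SparkConf, SparkContext
>
>     conf = (SparkConf()
>             .setMaster("spark://master:7077")
>             .setAppName("pool-test")                    # placeholder name
>             .set("spark.scheduler.mode", "FAIR")        # fair scheduling within the app
>             .set("spark.scheduler.allocation.file",
>                  "/opt/spark/conf/fairscheduler.xml"))  # assumed path to the template copy
>     sc = SparkContext(conf=conf)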
>
> When I log in as a second user and run pyspark,
> I see the expected pools as that user as well.
>
> When I open http://master:8080 in a web browser,
>
> I see that my first user's state is RUNNING and my second user's state is
> WAITING.
>
> So I try putting them both in the production pool, which uses the fair scheduler.
>
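> For concreteness, assigning jobs to that pool from a pyspark shell looks
> something like this (sc is the context the shell already creates; the count()
> is just a throwaway job to exercise the pool):
>
>     # run jobs submitted from this thread in the "production" pool
>     sc.setLocalProperty("spark.scheduler.pool", "production")
>     sc.parallelize(range(1000)).count()
>
>     # clear it again to fall back to the default pool
>     sc.setLocalProperty("spark.scheduler.pool", None)
>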
> When I refresh http://master:8080
>
> The second user's status is still WAITING.
>
> If I try to run something as the second user, I get
>
> "Initial job has not accepted any resources"
>
> Maybe fair scheduling is not what I want?
>
> I'm starting pyspark as follows:
>
> pyspark --master spark://master:7077
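>
> For comparison, that command does not cap the application's cores or memory;
> a capped variant would look something like the line below, with the numbers
> purely as placeholders:
>
>     pyspark --master spark://master:7077 --total-executor-cores 4 --executor-memory 2g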
>
> I started Spark as follows:
>
> start-all.sh
>
>
>
