>> ...as the second user I get
>>
>> "Initial job has not accepted any resources"
>>
>> Maybe fair queuing is not what I want?
>>
>> I'm starting pyspark as follows
>>
>> pyspark --master spark://master:7077
>>
>> I started spark as follows
>>
>> start-all.sh
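For reference, "Initial job has not accepted any resources" usually means no executor could be allocated for the second application: in standalone mode the first pyspark shell claims all of the workers' cores by default (spark.cores.max is unset), so a shell started by a second user has nothing left to run on. A minimal sketch of capping what each application claims, assuming the standalone master above and purely illustrative core/memory values:

# Cap the resources this application claims so another pyspark shell
# can still get executors on the same standalone cluster.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("second-user-shell")       # illustrative name
        .setMaster("spark://master:7077")
        .set("spark.cores.max", "2")           # leave cores for other apps
        .set("spark.executor.memory", "1g"))   # leave worker memory free

sc = SparkContext(conf=conf)
print(sc.parallelize(range(100)).sum())        # should run rather than hang

From the interactive shell the same limits can be passed as flags, e.g. pyspark --master spark://master:7077 --total-executor-cores 2 --executor-memory 1g, or set once in conf/spark-defaults.conf.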
I am having issues trying to set up Spark to run jobs simultaneously.
I thought I wanted FAIR scheduling?
I used the templated fairscheduler.xml as-is. When I start pyspark I see the
three expected pools: production, test, and default.
When I log in as a second user and run pyspark,
I see the expected pools there as well.
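For what it is worth, the FAIR scheduler arbitrates between jobs inside a single SparkContext; it does not, by itself, divide the cluster between two separate pyspark shells, which is governed by the per-application limits sketched above. A minimal sketch of using the template pools from within one application (the allocation-file path is an assumption about the installation, not taken from this thread):

# Enable FAIR scheduling inside one application and route a job into the
# "production" pool defined in the template fairscheduler.xml.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("fair-pools-demo")           # illustrative name
        .setMaster("spark://master:7077")
        .set("spark.scheduler.mode", "FAIR")
        .set("spark.scheduler.allocation.file",
             "/opt/spark/conf/fairscheduler.xml"))  # assumed path

sc = SparkContext(conf=conf)

# Jobs submitted from this thread now run in the "production" pool;
# other threads can target "test" or fall back to "default".
sc.setLocalProperty("spark.scheduler.pool", "production")
print(sc.parallelize(range(1000)).count())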
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/multiple-pyspark-instances-simultaneously-same-time-tp25144.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.