Fwd: multiple pyspark instances simultaneously (same time)

2015-10-22 Thread Jeff Sadowski
when I run pyspark as a second user I get "Initial job has not accepted any resources". Maybe fair queuing is not what I want? I'm starting pyspark as follows: pyspark --master spark://master:7077. I started Spark as follows: start-all.sh
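
In standalone mode the first application claims every available core by default, so a second shell will report "Initial job has not accepted any resources" no matter what scheduler pools exist. A minimal sketch of starting the shells with per-application caps so two pyspark sessions can share the cluster (the core and memory numbers are illustrative assumptions, not values from the thread):

    pyspark --master spark://master:7077 \
      --conf spark.cores.max=2 \
      --executor-memory 1g

With spark.cores.max set, the standalone master leaves cores unassigned for the next application instead of handing them all to the first shell.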

Re: multiple pyspark instances simultaneously (same time)

2015-10-22 Thread Akhil Das
> pyspark --master spark://master:7077
> I started spark as follows: start-all.sh
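
A related knob, not visible in this preview: the same cap can be made a cluster-wide default instead of being passed on every launch. A sketch of conf/spark-defaults.conf under that assumption (values are illustrative):

    spark.master            spark://master:7077
    spark.cores.max         2
    spark.executor.memory   1g

Alternatively, spark.deploy.defaultCores can be set on the master (via SPARK_MASTER_OPTS in conf/spark-env.sh) so that applications which do not set spark.cores.max still leave cores free for others.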

multiple pyspark instances simultaneously (same time)

2015-10-20 Thread Jeff Sadowski
I am having issues trying to set up Spark to run jobs simultaneously. I thought I wanted FAIR scheduling? I used the templated fairscheduler.xml as is. When I start pyspark I see the 3 expected pools: production, test, and default. When I log in as a second user and run pyspark I see the expected pools
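
For reference, the templated allocation file that ships with Spark looks roughly like the sketch below; the default pool is created implicitly, and the names and weights follow the stock template rather than the exact file used here:

    <?xml version="1.0"?>
    <allocations>
      <pool name="production">
        <schedulingMode>FAIR</schedulingMode>
        <weight>1</weight>
        <minShare>2</minShare>
      </pool>
      <pool name="test">
        <schedulingMode>FIFO</schedulingMode>
        <weight>2</weight>
        <minShare>3</minShare>
      </pool>
    </allocations>

It is enabled with spark.scheduler.mode=FAIR (plus spark.scheduler.allocation.file if the XML is not at conf/fairscheduler.xml), and a job is assigned to a pool from pyspark with sc.setLocalProperty("spark.scheduler.pool", "production"). Note that these pools only divide work between jobs inside a single SparkContext; they do not arbitrate cores between two separate pyspark shells, which is what the standalone resource caps above address.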

multiple pyspark instances simultaneously (same time)

2015-10-20 Thread jeff.sadow...@gmail.com

multiple pyspark instances simultaneously (same time)

2015-10-15 Thread jeff.sadow...@gmail.com