I am having issues trying to set up Spark to run jobs simultaneously. I thought FAIR scheduling was what I wanted?
I used the templated fairscheduler.xml as-is. When I start pyspark I see the three expected pools: production, test, and default. When I log in as a second user and run pyspark, I see the expected pools as that user as well.

When I open a web browser to http://master:8080, I see my first user's application state is RUNNING and my second user's state is WAITING. So I tried putting both applications in the production pool, which uses the FAIR scheduler. When I refresh http://master:8080, the second user's status is still WAITING, and if I try to run something as the second user I get "Initial job has not accepted any resources". Maybe fair queuing is not what I want?

I'm starting pyspark as follows:

pyspark --master spark://master:7077

and I started Spark with:

start-all.sh
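In case it helps, the pools I'm seeing come from the stock conf/fairscheduler.xml.template shipped with Spark, which I copied unmodified; it looks roughly like this (pool names, modes, and weights as in the shipped example):

```xml
<?xml version="1.0"?>
<allocations>
  <!-- "production" pool: FAIR scheduling among its jobs -->
  <pool name="production">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <!-- "test" pool: FIFO scheduling, higher weight -->
  <pool name="test">
    <schedulingMode>FIFO</schedulingMode>
    <weight>2</weight>
    <minShare>3</minShare>
  </pool>
</allocations>
```

To put my jobs in the production pool I set the local property from the shell, i.e. sc.setLocalProperty("spark.scheduler.pool", "production"), before running anything.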