Hi,

This is a hard problem to solve at the moment if your requirement is that you really need Spark to operate in coarse-grained mode. I assume this is a problem because you are trying to run two Spark applications (as opposed to two jobs within one application).
Obvious “solutions” would be to run both applications in fine-grained mode. You could also try submitting both jobs through the same SparkContext with its job scheduler set to FAIR (the default is FIFO). However, I don’t have enough context to know whether this latter option is applicable for you. If you need more help, please provide some context on what you’re trying to achieve.

Regards,
Hans

> On Apr 13, 2016, at 6:53 PM, Andreas Tsarida <[email protected]> wrote:
>
> Hello,
>
> I’m trying to figure out a solution for dynamic resource allocation in Mesos
> within the same framework (Spark).
>
> Scenario:
> 1 - run a Spark job in coarse-grained mode
> 2 - run a second job in coarse-grained mode
>
> The second job will not start until the first job finishes, which is not
> something I would want. The problem is small when the running job doesn’t
> take too long, but when it does, nobody can work on the cluster.
>
> The best scenario would be to have Mesos revoke resources from the first
> job and try to allocate them to the second job.
>
> Has anybody else solved this issue in another way?
>
> Thanks
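For reference, the two options above map onto Spark configuration properties like the following. This is only a sketch, assuming Spark 1.x on Mesos; the master URL, jar name, and main class are placeholders, not values from the thread:

```shell
# Option 1: run each application in fine-grained mode, so Mesos can
# reallocate cores between the two applications as their tasks finish.
# (mesos://mesos-master:5050, MyApp, and app.jar are placeholders.)
spark-submit \
  --master mesos://mesos-master:5050 \
  --conf spark.mesos.coarse=false \
  --class MyApp app.jar

# Option 2: submit both jobs through one SparkContext and set the
# in-application scheduler to FAIR (default is FIFO), so the second
# job gets resources while the first is still running.
spark-submit \
  --master mesos://mesos-master:5050 \
  --conf spark.scheduler.mode=FAIR \
  --class MyApp app.jar
```

Note that option 2 only helps with jobs inside a single application; it does not make Mesos revoke resources from a separate coarse-grained framework.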

