You should check out Cook from Two Sigma. It lets you fairly share your cluster between several Spark instances, scaling them up and down based on tenure and demand. Check it out at github.com/twosigma/cook

On Mon, Jan 25, 2016 at 6:18 PM Charles Allen <[email protected]> wrote:
> Is there an allocator out there that will prioritize frameworks to favor
> the oldest first?
>
> I'm using Spark in coarse mode, and I actually want the earliest
> registered coarse mode drivers to get as many resources as they want.

