Hi guys.

A somewhat edge-case topic for this mailing list, but still.

I'd like to run Spark on Mesos in a way that mixes jobs using
dynamic allocation with others that are "static" owners of their
resources.

I don't see any way Spark offers this natively.

The idea is to turn off the shuffle service on part of the nodes and
reserve those nodes for the "static" jobs. I'm not sure that would
work; could someone please advise?
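To make the idea concrete, here is a rough sketch of what I have in mind, assuming the nodes are tagged with a Mesos agent attribute (the attribute name "shuffle", the master URL, and the job files are made up for illustration):

```shell
# Dynamic-allocation job, constrained to nodes that run the external
# shuffle service (dynamic allocation requires it):
spark-submit \
  --master mesos://zk://host:2181/mesos \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.mesos.constraints="shuffle:true" \
  elastic_job.py

# "Static" job pinned to the remaining nodes, holding a fixed number
# of cores for its whole lifetime:
spark-submit \
  --master mesos://zk://host:2181/mesos \
  --conf spark.dynamicAllocation.enabled=false \
  --conf spark.cores.max=32 \
  --conf spark.mesos.constraints="shuffle:false" \
  static_job.py
```

That is, `spark.mesos.constraints` would do the partitioning, and only the dynamic-allocation jobs would depend on the shuffle service.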

Or, if you see any other, more general solution to this, do share.

Thanks a lot,

Gregory


-- 

http://ror6ax.github.io/
