Hi

Is it currently possible in Spark to specify that some worker nodes should be
preferred over others? That is, in a heterogeneous computing environment some
computing units can be more powerful than others, and assigning jobs to those
units should be prioritized.

Best regards,
Markus Losoi ([email protected])
