I have no idea, I'm not a Spark expert. As far as I've understood, Spark does not have an API to bind a task to a specific node.
foreachPartition provides an approximation, but you can have more than one partition per node, resulting in the problem I've described. I've put a rough sketch of what I mean at the bottom of this mail.

Cheers,
Paolo

On Thu, Jul 7, 2016 at 9:44 PM, vkulichenko <[email protected]> wrote:
> Paolo, Luis,
>
> Do you have any idea how to fix this? Is there a better way to execute a
> function on every executor in Spark?
>
> -Val
>
>
>
> --
> View this message in context:
> http://apache-ignite-users.70518.x6.nabble.com/Spark-Ignite-How-to-run-exactly-one-Ignite-worker-in-each-Spark-cluster-node-tp6137p6161.html
> Sent from the Apache Ignite Users mailing list archive at Nabble.com.
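P.S. Here is the sketch I mentioned above: a minimal, untested illustration of the foreachPartition approximation. startIgniteWorker() is a placeholder for whatever per-node initialization you need (e.g. starting an Ignite node), and the executor count is only a heuristic:

    import org.apache.spark.{SparkConf, SparkContext}

    object RunOnEachNode {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("one-worker-per-node"))

        // Heuristic: getExecutorMemoryStatus lists executors plus the driver,
        // so subtract one to approximate the executor count.
        val numExecutors = sc.getExecutorMemoryStatus.size - 1

        // Create one partition per executor and run the setup code in each.
        // CAVEAT: Spark gives no placement guarantee, so two of these
        // partitions can still land on the same node -- this is exactly the
        // problem described above.
        sc.parallelize(1 to numExecutors, numExecutors).foreachPartition { _ =>
          startIgniteWorker()
        }
      }

      // Placeholder for the per-node initialization you actually want to run.
      def startIgniteWorker(): Unit = { /* e.g. Ignition.start(cfg) */ }
    }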
