Hi Anish,

Spark, like MapReduce, makes an effort to schedule tasks on the nodes
and racks where their input blocks reside.
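
For anyone tuning this: the scheduler's willingness to wait for a
locality-preserving slot before falling back to a less-local one is
controlled by the spark.locality.wait family of properties. A sketch
(the values below are illustrative defaults, not recommendations):

```
# spark-defaults.conf (sketch)
# How long to wait for a slot at each locality level before
# downgrading to the next level (process-local -> node-local ->
# rack-local -> any).
spark.locality.wait          3s
spark.locality.wait.node     3s
spark.locality.wait.rack     3s
```

Raising these favors locality at the cost of scheduling latency;
setting them to 0 disables the wait entirely.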

-Sandy


On Tue, Jul 8, 2014 at 12:27 PM, anishs...@yahoo.co.in <
anishs...@yahoo.co.in> wrote:

> Hi All
>
> My apologies for the very basic question: does Spark fully support data
> locality, as MapReduce does?
>
> Please suggest.
>
> --
> Anish Sneh
> "Experience is the best teacher."
> http://in.linkedin.com/in/anishsneh
>
>