Dear all,

I want to set up Spark in cluster mode. The problem is that each worker node
looks for the file to process in its own local directory. Is it possible to
set things up with HDFS so that each worker node takes its part of the file
from HDFS? Any good tutorials for this?
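In case it helps frame the question: what I am after is something like the
following sketch, where the input path is an hdfs:// URL rather than a local
one (the master and namenode host names here are just placeholders, not my
actual setup):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object HdfsExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("HdfsExample")
      .setMaster("spark://master:7077") // standalone cluster master (placeholder host)
    val sc = new SparkContext(conf)

    // With an hdfs:// URL instead of a local path, Spark asks the HDFS
    // namenode for the file's block locations, and each worker processes
    // the blocks near it instead of looking for the whole file on its
    // own local disk.
    val lines = sc.textFile("hdfs://namenode:9000/data/input.txt")
    println(lines.count())

    sc.stop()
  }
}
```

Is this the right general shape, and what is the recommended way to get HDFS
itself running alongside the Spark workers?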

Thanks
