Hi all,
does anyone have an example or an idea of how to do a logical distribution of parts of a file across Hadoop's data nodes? What I'd like to understand is whether it's feasible and reasonable to:
- Create custom chunks of a file (e.g., one chunk per department; see the sketch below)
- Send each chunk to a different data node
- And then run a job on each of those data nodes?
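
To make the idea concrete, here is a rough sketch of what I mean by the first step: splitting an input file into one HDFS file per department. Everything in it (the class name, the paths, the column index, the idea of a comma-separated file with a "department" column) is just a placeholder for illustration, and as far as I understand HDFS still decides on which data nodes the blocks of each output file actually end up, which is part of what I'm asking about.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SplitByDepartment {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path input = new Path("/data/employees.csv");   // hypothetical input file
        Path outDir = new Path("/data/by-department");  // hypothetical output dir
        int deptColumn = 2;                             // hypothetical column index

        Map<String, FSDataOutputStream> writers = new HashMap<String, FSDataOutputStream>();
        BufferedReader reader = new BufferedReader(new InputStreamReader(fs.open(input)));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                String dept = line.split(",")[deptColumn];
                FSDataOutputStream out = writers.get(dept);
                if (out == null) {
                    // one "chunk" (file) per department; HDFS chooses on which
                    // data nodes the blocks and replicas of this file are placed
                    out = fs.create(new Path(outDir, dept + ".csv"));
                    writers.put(dept, out);
                }
                out.writeBytes(line + "\n");
            }
        } finally {
            reader.close();
            for (FSDataOutputStream out : writers.values()) {
                out.close();
            }
        }
    }
}

After a split like this I imagine running a separate job per department file, but I don't see how to pin each file (and the computation on it) to a specific data node, which is why I'm asking whether the whole approach makes sense.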
Thanks

Daniele