Hello!

I have the following queries related to Hadoop:

-> Once I place my data in HDFS, it gets split into blocks and replicated
automatically across the datanodes, right? Hadoop takes care of all of that.

-> Now, suppose there is a third party that is not participating in the
Hadoop cluster at all, i.e. his machine is not one of the cluster's nodes,
and he has some data on his local filesystem. Can I place this data into
HDFS? How? (I have put a rough sketch of my guess just below.)
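
Here is roughly what I imagine that client doing with the HDFS FileSystem
API (the namenode URI and the paths are only placeholders I made up; the
client machine would just need the Hadoop client libraries and network
access to the cluster):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class PutIntoHdfs {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Point the client at the cluster's namenode (placeholder URI).
            FileSystem fs = FileSystem.get(
                URI.create("hdfs://namenode-host:9000"), conf);
            // Copy a file from the third party's local disk into HDFS.
            fs.copyFromLocalFile(new Path("/home/user/localdata/input.txt"),
                                 new Path("/user/sugandha/input.txt"));
            fs.close();
        }
    }

Is that roughly the right approach?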

-> Then, when that third party asks for a file, a directory, or any other
data that was previously dumped into HDFS without his knowledge, he wants it
back (wants to retrieve it). The data should end up on his local filesystem
again, in some specific directory. How can I do this? (Again, my guess is
sketched below.)
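
My guess is that the reverse direction looks something like this (again, the
namenode URI and the target directory are placeholders I made up):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class GetFromHdfs {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(
                URI.create("hdfs://namenode-host:9000"), conf);
            // Copy the file (or directory) out of HDFS into the third
            // party's chosen local directory.
            fs.copyToLocalFile(new Path("/user/sugandha/input.txt"),
                               new Path("/home/user/retrieved"));
            fs.close();
        }
    }

Is that correct, or is there a better way?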

-> Will I have to use MapReduce or something else to make this work?

-> Also, if I write MapReduce code for the complete activity, how will I
fetch the files that are stored in HDFS as chunks (blocks), reassemble them
into a complete file, and place it on the local filesystem of a node that is
not part of the Hadoop cluster? (My rough guess follows below.)
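
Or does the FileSystem API already hide the blocks, so that a plain client
program gets the whole file back as one stream and never has to reassemble
anything itself? Something like this is what I have in mind (the URI and
paths are placeholders):

    import java.io.FileOutputStream;
    import java.io.OutputStream;
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class StreamWholeFile {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(
                URI.create("hdfs://namenode-host:9000"), conf);
            // open() returns a single stream for the whole file; HDFS
            // fetches the individual blocks from the datanodes behind
            // the scenes.
            FSDataInputStream in = fs.open(new Path("/user/sugandha/input.txt"));
            OutputStream out =
                new FileOutputStream("/home/user/retrieved/input.txt");
            IOUtils.copyBytes(in, out, 4096, true); // true = close both streams
            fs.close();
        }
    }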

Eagerly waiting for a reply!

Thanking You,
Sugandha!



-- 
Regards!
Sugandha
