Hello!

I have a 4-node Hadoop cluster running. Now there is a 5th machine which
acts as a Hadoop client; it is not part of the cluster (it is not listed in
the master/slave config files). I now have to write Java code that runs on
this client, puts the client system's data into HDFS (replicated over the 2
datanodes) and, as per my requirement, lets me simply fetch it back onto the
client machine itself.
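
Just to make the "fetch it back" part concrete, what I have in mind is
roughly the sketch below; the namenode URI and the local destination path
are only placeholders for my setup (the "put" part is what the steps further
down describe):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FetchFromHdfs {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://namenode:9000"); // placeholder namenode address

            FileSystem hdfs = FileSystem.get(conf);

            // copy the file out of HDFS back onto the client machine
            hdfs.copyToLocalFile(new Path("/user/hadoop/test.java"),
                                 new Path("/home/hadoop/fetched/test.java")); // placeholder local path

            hdfs.close();
        }
    }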

For this, I have done following things as of now::

***********************************
-> Among the 4 nodes, 2 are datanodes and the other 2 are the namenode and
the jobtracker respectively.
***********************************

***********************************
-> Now, to make that code work on the client machine, I have designed a UI.
Here on the client machine, do I need to install Hadoop?
***********************************

***********************************
-> I have installed Hadoop on it, and in its config file I have specified
only 2 tags (a rough sketch of how I build the Configuration from them is
just after this block):
   1) fs.default.name -> value = the namenode's address
   2) dfs.http.address -> value = the namenode's address
***********************************
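
In the Java code I expect new Configuration() to pick those two values up
from the client's Hadoop conf directory; setting them explicitly would look
roughly like this (both addresses below are placeholders, not my real ones):

    import org.apache.hadoop.conf.Configuration;

    public class ClientConf {
        public static Configuration create() {
            // new Configuration() also loads core-site.xml / hdfs-site.xml from the
            // classpath, which is where the two tags above live on my client install.
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://namenode:9000");  // placeholder RPC address
            conf.set("dfs.http.address", "namenode:50070");       // placeholder HTTP address
            return conf;
        }
    }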

***********************************
Thus, if there is a file at /home/hadoop/test.java on the client machine, I
will have to get an instance of the HDFS filesystem via FileSystem.get,
right? (See the sketch just after this block.)
***********************************
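
That is, something along these lines (the namenode URI is again a
placeholder); I also grab the client's local filesystem since I will need it
as the copy source:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.LocalFileSystem;

    public class GetFileSystems {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://namenode:9000"); // placeholder

            // HDFS instance, resolved from fs.default.name
            FileSystem hdfs = FileSystem.get(conf);

            // local filesystem of the client machine (the /home/hadoop/test.java side)
            LocalFileSystem local = FileSystem.getLocal(conf);

            System.out.println("hdfs  = " + hdfs.getUri());
            System.out.println("local = " + local.getUri());
        }
    }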

***********************************
Then, using FileUtil.copy, I will have to simply specify the local
filesystem as the source and HDFS as the destination, with the source path
/home/hadoop/test.java and the destination /user/hadoop/, right? (The call I
mean is sketched after this block.) So it should work...!
***********************************
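
The copy call I mean is roughly the following (the namenode URI is a
placeholder; the paths are the ones above):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FileUtil;
    import org.apache.hadoop.fs.Path;

    public class PutIntoHdfs {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://namenode:9000"); // placeholder
            conf.set("dfs.replication", "2");                    // so the file lands on both datanodes

            FileSystem local = FileSystem.getLocal(conf);  // client's local fs (source)
            FileSystem hdfs  = FileSystem.get(conf);       // HDFS (destination)

            // copy /home/hadoop/test.java from the local fs into /user/hadoop/ on HDFS;
            // 'false' means: do not delete the source after copying
            FileUtil.copy(local, new Path("/home/hadoop/test.java"),
                          hdfs,  new Path("/user/hadoop/"),
                          false, conf);
        }
    }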

***********************************
-> But it gives me an error: "not able to find src path
/home/hadoop/test.java".

-> Will I have to use the RPC classes and methods in the Hadoop API to do
this?
***********************************

***********************************
Things don't seem to be working in any of the ways I've tried. Please help
me out.
***********************************

Thanks!
