Dear all,

I want to know if there is any class, or any other way, to access the file list and
metadata of a remote HDFS namenode.
For example, suppose there are two Hadoop instances, which means two namenodes
(nn1 and nn2). If I am a superuser on both instances, and I am currently on nn1,
is there any way to get nn2's file list and metadata?

Right now I can only do it the most traditional way, which is

si...@nn1:~$ ssh nn2 "/hadoop/bin/hadoop fs -lsr "

and then parse the output to get each file's metadata.
Is there any class or API I could use instead?

My dream way is to build my own jar on nn1 and run it with

bin/hadoop jar remote.jar remote

so that it prints information about a given directory of the remote HDFS.
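Roughly, I am imagining something like the sketch below. I am not sure this
is the intended API; the FileSystem.get(URI, Configuration) call is my guess
at how to point the client at a remote namenode, and the hdfs://nn2:9000
address and the RemoteLs class name are just placeholders I made up:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class RemoteLs {
        public static void main(String[] args) throws Exception {
            // Placeholder address: the real host/port of nn2 would go here.
            URI nn2 = new URI("hdfs://nn2:9000/");
            FileSystem fs = FileSystem.get(nn2, new Configuration());

            // List the directory given on the command line (default "/")
            // and print each entry's path, length, and owner.
            Path dir = new Path(args.length > 0 ? args[0] : "/");
            for (FileStatus status : fs.listStatus(dir)) {
                System.out.printf("%s\t%d\t%s%n",
                        status.getPath(), status.getLen(), status.getOwner());
            }
            fs.close();
        }
    }

Would something along these lines work against a namenode I am not logged
in to, or is there a better class for this?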

Thanks a lot.

Best Regards,
Simon
