Hi,
Some time back I wrote an article on how to set up a development
environment for Hadoop and how to debug it:
http://www.codeproject.com/Articles/1067129/Debugging-Hadoop-HDFS-using-IntelliJ-IDEA-on-Linux
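The short version of the debugging part is to start the daemon with the JVM
debug agent enabled and attach IntelliJ to that port. A minimal sketch,
assuming you want to debug the NameNode (port 5005 is an arbitrary choice,
and this is not necessarily the exact recipe from the article):

  # in hadoop-env.sh: make the NameNode JVM listen for a remote debugger
  export HADOOP_NAMENODE_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 $HADOOP_NAMENODE_OPTS"

Then create a Remote debug configuration in IntelliJ pointing at port 5005.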
Feel free to reach out to me with any issues.
Thanks
Mallan
On Nov 15, 2016 5:49 PM, "Brahma Reddy Battula" wrote:
The following links might be useful for you:
https://wiki.apache.org/hadoop/EclipseEnvironment
http://blog.cloudera.com/blog/2013/05/how-to-configure-eclipse-for-hadoop-contributions/
https://www.quora.com/What-are-the-best-ways-to-learn-about-Hadoop-source
Regards
Brahma Reddy Battula
From:
Thanks Brahma. That certainly cleared up a lot of doubts - the file did
indeed show up in "fsck -openforwrite", and deleting it made the node move
to the "decommissioned" state.
So the recommendation here is to wait for all files having blocks on the
node to be closed before adding it to the excludes file?
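For anyone searching the archives later, the check is (the path "/" here
just means the whole namespace; -openforwrite, -files, -blocks and
-locations are standard fsck options):

  hdfs fsck / -openforwrite -files -blocks -locations

which matches what happened here: the file still open for write was holding
up the decommission until it was closed/deleted.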
(Keeping the user mailing list in the loop.)
You can compile just the module which you modified.
Please refer to the "Where to run Maven from?" section of the following:
https://github.com/apache/hadoop/blob/trunk/BUILDING.txt
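As a rough sketch (assuming your change was under HDFS; do one full
top-level build first, as BUILDING.txt explains):

  # one-time full build from the source root
  mvn install -DskipTests
  # afterwards, rebuild only the module you touched
  cd hadoop-hdfs-project/hadoop-hdfs
  mvn install -DskipTests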
Regards
Brahma Reddy Battula
-----Original Message-----
From: Madhvaraj Shetty
Please check my inline comments on your queries. I hope I have answered all
your questions…
Regards
Brahma Reddy Battula
From: Hariharan [mailto:hariharan...@gmail.com]
Sent: 15 November 2016 18:55
To: user@hadoop.apache.org
Subject: HDFS - Corrupt replicas preventing decommissioning?
Hello
Hi Ajay
For the running containers, you can get the container report from the
ResourceManager. For completed/killed containers, you need to start the
ApplicationHistoryServer daemon and use the same API, i.e.
yarnClient.getContainerReport(), to get the container report. Basically,
this API contacts the RM first for
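A minimal sketch of that usage (the container ID below is hypothetical, and
ContainerId.fromString assumes a reasonably recent Hadoop release; older
ones used ConverterUtils.toContainerId instead):

import org.apache.hadoop.yarn.api.records.ContainerId;
import org.apache.hadoop.yarn.api.records.ContainerReport;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class ContainerReportExample {
  public static void main(String[] args) throws Exception {
    // Start a YarnClient against the cluster configuration on the classpath.
    YarnClient yarnClient = YarnClient.createYarnClient();
    yarnClient.init(new YarnConfiguration());
    yarnClient.start();

    // Hypothetical container ID; substitute a real one.
    ContainerId containerId =
        ContainerId.fromString("container_1479196800000_0001_01_000001");

    // Asks the RM first; for finished containers the lookup falls back to
    // the ApplicationHistoryServer, which therefore has to be running.
    ContainerReport report = yarnClient.getContainerReport(containerId);
    System.out.println("State: " + report.getContainerState());
    System.out.println("Node:  " + report.getAssignedNode());
    System.out.println("Diagnostics: " + report.getDiagnosticsInfo());

    yarnClient.stop();
  }
}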