Hi Eric,

Are you running a production environment on Hadoop 1.0.3? If yes, then you
have to upgrade to Hadoop 1.2.0 or Hadoop 2.2.0. If you don't want to move
to another Hadoop version, you need to backport the patch to your own code
base (I'm not sure the patch provided in HDFS-385 applies cleanly to Hadoop
1.0.3; if not, you have to resolve the conflicts yourself). HDFS-3601 is not
a good example for you, since it targets Hadoop 2.x. You can read the
implementation of this class instead:
https://github.com/apache/hadoop-common/blob/release-1.2.0/src/hdfs/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyWithNodeGroup.java
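To give you a feel for what "writing your own policy" looks like, here is a
minimal sketch of a custom policy class. It is based on my reading of the
branch-1 sources, so treat the class name MyBlockPlacementPolicy, the method
signatures, and the config key as things you must verify against the
BlockPlacementPolicy / BlockPlacementPolicyDefault classes in your exact
Hadoop version, not as a drop-in implementation. I put it in the
org.apache.hadoop.hdfs.server.namenode package because several of the types
it touches are package-private in the 1.x code base.

package org.apache.hadoop.hdfs.server.namenode;

import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.net.NetworkTopology;

/**
 * Sketch of a custom block placement policy: it delegates everything to the
 * default rack-aware policy and gives you one hook (chooseTarget) where you
 * can put your own target-selection logic.
 */
public class MyBlockPlacementPolicy extends BlockPlacementPolicyDefault {

  @Override
  public void initialize(Configuration conf, FSClusterStats stats,
                         NetworkTopology clusterMap) {
    super.initialize(conf, stats, clusterMap);
    // Read any custom configuration keys of your own here.
  }

  @Override
  public DatanodeDescriptor[] chooseTarget(String srcPath, int numOfReplicas,
      DatanodeDescriptor writer, List<DatanodeDescriptor> chosenNodes,
      long blocksize) {
    // Replace this with your own placement logic; for now it just reuses
    // the default rack-aware behavior.
    return super.chooseTarget(srcPath, numOfReplicas, writer, chosenNodes,
        blocksize);
  }
}

Note that there are several chooseTarget overloads (e.g. one that takes an
excluded-nodes map), so depending on which code paths you care about you may
need to override those as well. Once the class compiles against your Hadoop
jars, put your jar on the NameNode classpath, point
dfs.block.replicator.classname in hdfs-site.xml to the class (that is the
key HDFS-385 introduced, if I remember correctly), and restart the NameNode.
Again, double-check the key name and the method signatures against your own
source tree first.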
-- "If I want to control the block placement then I have to write code
rather than type shell commands?"
Yes, if you want to implement your own logic on block placement, you have to
write code.

Regards,
*Stanley Shi,*


On Wed, Mar 19, 2014 at 3:07 AM, Eric Chiu <[email protected]> wrote:

> Hi Stanley,
>
> Thanks for your response, but I still have some problems. Could you give
> me further instructions?
> I am now using Hadoop 1.0.3. Does that mean I have to upgrade to 1.2.0? Or
> can I directly override the original code, and if so, with what command?
>
> Another question: you said I can refer to HDFS-3601, but which version of
> the patch? I notice there are 6 versions; v1 and v2 modified one file,
> while v3, v4, v5 and v6 modified another file, with v3/v4 showing one
> relationship and v5/v6 showing another. That confused me. There is also a
> branch-2 file, which confused me again.
>
> The last problem is: how do I start reading the code to know what policy
> is in use? If I want to control the block placement, do I have to write
> code rather than type shell commands? The code base is so big that I do
> not know where to start. Could you give me some hints?
>
> Since I am a new user, I am sorry if I asked a stupid question. But I
> really did not mean to.
>
> Thanks,
>
> Eric
>
> 2014-03-18 13:43 GMT+08:00 Stanley Shi <[email protected]>:
>
>> This JIRA is included in Apache code since versions 0.21.0
>> <https://issues.apache.org/jira/browse/HDFS/fixforversion/12314046>,
>> 1.2.0 <https://issues.apache.org/jira/browse/HDFS/fixforversion/12321657>
>> and 1-win
>> <https://issues.apache.org/jira/browse/HDFS/fixforversion/12320362>.
>> If you want to use it, you need to write your own policy; please see this
>> JIRA for an example: https://issues.apache.org/jira/browse/HDFS-3601
>>
>>
>> Regards,
>> *Stanley Shi,*
>>
>>
>>
>> On Mon, Mar 17, 2014 at 11:31 AM, Eric Chiu <[email protected]> wrote:
>>
>>> Hi all,
>>>
>>> Could anyone tell me how to install and use this Hadoop plug-in?
>>>
>>> https://issues.apache.org/jira/browse/HDFS-385
>>>
>>> I read the code but do not know where to install it or what command to
>>> use to install it all.
>>>
>>> Another problem is that there are .txt and .patch files -- which one
>>> should be applied?
>>>
>>> Some of the .patch files have -win; does that mean those files are for
>>> Windows Hadoop users? (I am using Ubuntu.)
>>>
>>> Thank you very much.
>>>
