Hi All,

      I am trying to set up Hadoop in highly available mode, so I downloaded 
the source code from 
https://github.com/facebook/hadoop-20-warehouse/tree/master/src. When I try to 
build this source code, I get the errors below. Why does this happen? If anyone 
has faced the same issue in their configuration, please tell me how you handled 
and rectified it.

compile-mapred-classes:
    [javac] /home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build.xml:392: 
warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 304 source files to 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build/classes
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1200:
 warning: [unchecked] unchecked cast
    [javac] found   : java.lang.Class<capture#419 of ? extends 
org.apache.hadoop.mapred.Reducer>
    [javac] required: java.lang.Class<? extends 
org.apache.hadoop.mapred.Reducer<K,V,K,V>>
    [javac]         (Class<? extends Reducer<K,V,K,V>>) 
job.getCombinerClass();
    [javac]                                                                 ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1202:
 warning: [unchecked] unchecked call to OldCombinerRunner(java.lang.Class<? 
extends 
org.apache.hadoop.mapred.Reducer<K,V,K,V>>,org.apache.hadoop.mapred.JobConf,org.apache.hadoop.mapred.Counters.Counter,org.apache.hadoop.mapred.Task.TaskReporter)
 as a member of the raw type org.apache.hadoop.mapred.Task.OldCombinerRunner
    [javac]         return new OldCombinerRunner(cls, job, inputCounter, 
reporter);
    [javac]                ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1202:
 warning: [unchecked] unchecked conversion
    [javac] found   : org.apache.hadoop.mapred.Task.OldCombinerRunner
    [javac] required: org.apache.hadoop.mapred.Task.CombinerRunner<K,V>
    [javac]         return new OldCombinerRunner(cls, job, inputCounter, 
reporter);
    [javac]                ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1209:
 warning: [unchecked] unchecked cast
    [javac] found   : java.lang.Class<capture#361 of ? extends 
org.apache.hadoop.mapreduce.Reducer<?,?,?,?>>
    [javac] required: java.lang.Class<? extends 
org.apache.hadoop.mapreduce.Reducer<K,V,K,V>>
    [javac]            taskContext.getCombinerClass();
    [javac]                                        ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1231:
 warning: [unchecked] unchecked cast
    [javac] found   : java.lang.Class<capture#478 of ?>
    [javac] required: java.lang.Class<K>
    [javac]       keyClass = (Class<K>) job.getMapOutputKeyClass();
    [javac]                                                     ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1232:
 warning: [unchecked] unchecked cast
    [javac] found   : java.lang.Class<capture#966 of ?>
    [javac] required: java.lang.Class<V>
    [javac]       valueClass = (Class<V>) job.getMapOutputValueClass();
    [javac]                                                         ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1233:
 warning: [unchecked] unchecked cast
    [javac] found   : org.apache.hadoop.io.RawComparator
    [javac] required: org.apache.hadoop.io.RawComparator<K>
    [javac]       comparator = (RawComparator<K>) 
job.getOutputKeyComparator();
    [javac]                                                                 ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1275:
 warning: [unchecked] unchecked conversion
    [javac] found   : java.lang.Class
    [javac] required: java.lang.Class<? extends 
org.apache.hadoop.mapreduce.Reducer<K,V,K,V>>
    [javac]       this.reducerClass = reducerClass;
    [javac]                           ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1277:
 warning: [unchecked] unchecked cast
    [javac] found   : java.lang.Class<capture#879 of ?>
    [javac] required: java.lang.Class<K>
    [javac]       keyClass = (Class<K>) context.getMapOutputKeyClass();
    [javac]                                                         ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1278:
 warning: [unchecked] unchecked cast
    [javac] found   : java.lang.Class<capture#640 of ?>
    [javac] required: java.lang.Class<V>
    [javac]       valueClass = (Class<V>) 
context.getMapOutputValueClass();
    [javac]                                                             ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1279:
 warning: [unchecked] unchecked cast
    [javac] found   : org.apache.hadoop.io.RawComparator<capture#669 of ?>
    [javac] required: org.apache.hadoop.io.RawComparator<K>
    [javac]       comparator = (RawComparator<K>) 
context.getSortComparator();
    [javac]                                                                ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1313:
 warning: [unchecked] unchecked call to 
OutputConverter(org.apache.hadoop.mapred.OutputCollector<K,V>) as a 
member of the raw type 
org.apache.hadoop.mapred.Task.NewCombinerRunner.OutputConverter
    [javac]                                                 new 
OutputConverter(collector),
    [javac]                                                 ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1313:
 warning: [unchecked] unchecked conversion
    [javac] found   : 
org.apache.hadoop.mapred.Task.NewCombinerRunner.OutputConverter
    [javac] required: org.apache.hadoop.mapreduce.RecordWriter<K,V>
    [javac]                                                 new 
OutputConverter(collector),
    [javac]                                                 ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1311:
 warning: [unchecked] unchecked method invocation: 
<INKEY,INVALUE,OUTKEY,OUTVALUE>createReduceContext(org.apache.hadoop.mapreduce.Reducer<INKEY,INVALUE,OUTKEY,OUTVALUE>,org.apache.hadoop.conf.Configuration,org.apache.hadoop.mapreduce.TaskAttemptID,org.apache.hadoop.mapred.RawKeyValueIterator,org.apache.hadoop.mapreduce.Counter,org.apache.hadoop.mapreduce.Counter,org.apache.hadoop.mapreduce.RecordWriter<OUTKEY,OUTVALUE>,org.apache.hadoop.mapreduce.OutputCommitter,org.apache.hadoop.mapreduce.StatusReporter,org.apache.hadoop.io.RawComparator<INKEY>,java.lang.Class<INKEY>,java.lang.Class<INVALUE>)
 in org.apache.hadoop.mapred.Task is applied to 
(org.apache.hadoop.mapreduce.Reducer<K,V,K,V>,org.apache.hadoop.mapred.JobConf,org.apache.hadoop.mapreduce.TaskAttemptID,org.apache.hadoop.mapred.RawKeyValueIterator,<nulltype>,org.apache.hadoop.mapred.Counters.Counter,org.apache.hadoop.mapred.Task.NewCombinerRunner.OutputConverter,org.apache.hadoop.mapreduce.OutputCommitter,org.apache.hadoop.mapred.Task.TaskReporter,org.apache.hadoop.io.RawComparator<K>,java.lang.Class<K>,java.lang.Class<V>)
    [javac]            reducerContext = createReduceContext(reducer, job, 
taskId,
    [javac]                                                ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/Task.java:1317:
 warning: [unchecked] unchecked conversion
    [javac] found   : org.apache.hadoop.mapreduce.Reducer.Context
    [javac] required: org.apache.hadoop.mapreduce.Reducer<K,V,K,V>.Context
    [javac]       reducer.run(reducerContext);
    [javac]                   ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/mapred/org/apache/hadoop/mapred/LocalJobRunner.java:348:
 warning: [unchecked] unchecked call to serialize(T) as a member of the raw 
type org.apache.hadoop.io.serializer.Serializer
    [javac]             serializer.serialize(splits.get(i));
    [javac]                                 ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build/src/org/apache/hadoop/mapred/locality_jsp.java:79:
 warning: [unchecked] unchecked conversion
    [javac] found   : java.util.ArrayList
    [javac] required: 
java.util.Collection<org.apache.hadoop.mapred.JobInProgress>
    [javac]   Collection<JobInProgress> jobs = new ArrayList();
    [javac]                                    ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 17 warnings
     [copy] Copying 3 files to 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build/classes
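(As an aside, the "'includeantruntime' was not set" message at the top of each javac block is only a warning, not the cause of the failure. If desired, it can be silenced by setting the attribute explicitly on the `<javac>` tasks in build.xml; the sketch below is illustrative, with placeholder attribute values, not the actual build file contents:)

```xml
<!-- Hedged sketch: adding includeantruntime="false" to a <javac> task
     silences the warning and makes builds repeatable. The srcdir/destdir
     values here are placeholders; keep the task's existing attributes. -->
<javac srcdir="${src.dir}" destdir="${build.classes}"
       includeantruntime="false"/>
```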

compile-hdfs-classes:
    [javac] /home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build.xml:428: 
warning: 'includeantruntime' was not set, defaulting to 
build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 156 source files to 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build/classes
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1215:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]   public CorruptFileBlocks listCorruptFileBlocks(String path,
    [javac]          ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1225:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]   private CorruptFileBlocks 
versionBasedListCorruptFileBlocks(String path,
    [javac]           ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1246:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]   private CorruptFileBlocks methodBasedListCorruptFileBlocks(String 
path,
    [javac]           ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/tools/JMXGet.java:45:
 warning: sun.management.ConnectorAddressLink is Sun proprietary API and may be 
removed in a future release
    [javac] import sun.management.ConnectorAddressLink;
    [javac]                      ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1231:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]         return new CorruptFileBlocks(new String[0], "");
    [javac]                    ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1240:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]       return new CorruptFileBlocks(str.toArray(new 
String[str.size()]), "");
    [javac]                  ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1253:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]         return new CorruptFileBlocks(new String[0], "");
    [javac]                    ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/DFSClient.java:1262:
 reference to CorruptFileBlocks is ambiguous, both class 
org.apache.hadoop.hdfs.protocol.CorruptFileBlocks in 
org.apache.hadoop.hdfs.protocol and class 
org.apache.hadoop.fs.CorruptFileBlocks in org.apache.hadoop.fs match
    [javac]       return new CorruptFileBlocks(str.toArray(new 
String[str.size()]), "");
    [javac]                  ^
    [javac] 
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/src/hdfs/org/apache/hadoop/hdfs/tools/JMXGet.java:148:
 warning: sun.management.ConnectorAddressLink is Sun proprietary API and may be 
removed in a future release
    [javac]       url_string = 
ConnectorAddressLink.importFrom(Integer.parseInt(localVMPid));
    [javac]                    ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 7 errors
    [javac] 2 warnings

BUILD FAILED
/home/test/ips/facebook-hadoop-20-warehouse-bbfed86/build.xml:428: Compile 
failed; see the compiler error output for details.

Total time: 4 minutes 17 seconds
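For reference, the seven actual errors are all of the form "reference to CorruptFileBlocks is ambiguous": DFSClient.java evidently pulls in both org.apache.hadoop.hdfs.protocol and org.apache.hadoop.fs, and each package defines a class named CorruptFileBlocks, so the simple name no longer resolves. A minimal standalone sketch of the same situation, using the classic java.util.List / java.awt.List clash rather than the Hadoop classes:

```java
import java.util.*;   // exports java.util.List
import java.awt.*;    // also exports a class named List

public class AmbiguousDemo {
    public static void main(String[] args) {
        // Declaring "List<String> names" here would fail to compile:
        //   reference to List is ambiguous, both java.util.List and
        //   java.awt.List match the on-demand imports above.
        // A fully qualified name (or a single-type import such as
        // "import java.util.List;") removes the ambiguity:
        java.util.List<String> names = new ArrayList<>();
        names.add("ok");
        System.out.println(names.get(0));
    }
}
```

The analogous fix in DFSClient.java would presumably be to qualify or single-type-import whichever CorruptFileBlocks the code intends, but I have not verified which one that is.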


Thanks in advance.

Regards,

Shanmuganathan
