Hi,
So the same package in both jars is not the problem; I should have known that.
I still do not know why this happens. Any ideas?
Regards,
Martin
On 22.09.2010 17:45, Martin Becker wrote:
Hi,
Tom, thanks for your answer.
OK, so the problem is that when I add both hadoop-common-0.21.0.jar
AND hadoop-mapred-0.21.0.jar to the class path, Eclipse marks the
Tool interface as deprecated. This seems odd. With only
hadoop-common-0.21.0.jar on the class path everything works fine.
Both jars define classes in the package org.apache.hadoop.util, where
the Tool interface is located. Could this be the problem? I am a
little clueless here and not sure whether this is something that
should be pursued further on this mailing list.
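(As a side note, one way to narrow this down might be to check which jar the
Tool interface is actually resolved from; the following is just a throwaway
sketch, and the class name WhichJar is made up:)

import org.apache.hadoop.util.Tool;

// Throwaway diagnostic (plain Java, nothing Hadoop-specific): prints the
// location on the class path that the Tool interface was loaded from.
public class WhichJar {
    public static void main(String[] args) {
        System.out.println(
            Tool.class.getProtectionDomain().getCodeSource().getLocation());
    }
}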
Thanks in advance,
Martin
On 22.09.2010 16:08, Tom White wrote:
Hi Martin,
Neither Tool nor ToolRunner is deprecated in 0.21.0. I don't think
they have ever been deprecated. You should be able to use them without
problems.
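For example, a minimal driver along those lines could look like the sketch
below. It only illustrates the usual Configured/Tool/ToolRunner pattern; the
class name WordCount and the word-count mapper/reducer are stand-ins for your
own code, and the job setup inside run() mirrors the snippet from your mail:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Cluster;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCount extends Configured implements Tool {

  // Standard word-count mapper: emit (token, 1) for every token in the line.
  public static class Map extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Standard word-count reducer (also used as the combiner): sum the counts.
  public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  @Override
  public int run(String[] args) throws Exception {
    // Same job setup as in the original mail, but starting from the
    // Configuration that ToolRunner has already populated.
    Cluster cluster = new Cluster(getConf());
    Job job = Job.getInstance(cluster);
    job.setJarByClass(WordCount.class);
    job.setMapperClass(Map.class);
    job.setCombinerClass(Reduce.class);
    job.setReducerClass(Reduce.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    // ToolRunner parses the generic options (-conf, -D, -fs, -jt, ...)
    // before handing the remaining arguments to run().
    System.exit(ToolRunner.run(new Configuration(), new WordCount(), args));
  }
}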
Tom
On Wed, Sep 22, 2010 at 6:52 AM, Martin Becker <[email protected]> wrote:
Hello,
I am trying to move to Hadoop MapReduce 0.21.0.
The corresponding tutorial still uses Tool and ToolRunner.
Yet both are deprecated. What would be the correct way to implement,
configure and submit a Job now? I was thinking in terms of:
Configuration configuration = new Configuration();
Cluster cluster = new Cluster(configuration);
Job job = Job.getInstance(cluster);
job.setJarByClass(WordCount.class);
job.setMapperClass(Map.class);
job.setCombinerClass(Reduce.class);
job.setReducerClass(Reduce.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
FileInputFormat.addInputPath(job, new Path(INPUT));
FileOutputFormat.setOutputPath(job, new Path(OUTPUT));
System.exit(job.waitForCompletion(true) ? 0 : 1);

Thanks in advance,
Martin