Thanks for the prompt reply, and I'm sorry I forgot to include the exception.
My bad; I've included it below. There certainly appears to be a server running
on localhost:9001. At least, I was able to telnet to that address. While in
development, I'm treating the server on localhost as the remote server; in
production, a different remote server address would obviously be configured.
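
For what it's worth, my plan is to avoid hardcoding that address and to read it
from a system property instead, so only configuration changes between
environments. A rough sketch (the property name "offline.jobtracker" is a
placeholder of my own, not an existing Hadoop property):

    // Read the jobtracker address from a system property so that dev and
    // production differ only in configuration, not code.
    final String tracker = System.getProperty("offline.jobtracker", "localhost:9001");
    conf.set("mapred.job.tracker", tracker);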
Root Exception stack trace:
java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:375)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
+ 3 more (set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
********************************************************************************
On Feb 12, 2013, at 4:22 PM, Nitin Pawar <[email protected]> wrote:
> conf.set("mapred.job.tracker", "localhost:9001");
>
> this means that your jobtracker is on port 9001 on localhost
>
> if you change it to the remote host, and that's the port it's running on
> there, then it should work as expected
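>
> for example (hostname and ports below are placeholders for wherever your
> cluster actually runs):
>
>     conf.set("fs.default.name", "hdfs://remotehost:9000");
>     conf.set("mapred.job.tracker", "remotehost:9001");
>
> you will usually want fs.default.name pointing at the remote HDFS as well, so
> the input/output paths resolve on the cluster rather than locally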
>
> what's the exception you are getting?
>
>
> On Wed, Feb 13, 2013 at 2:41 AM, Alex Thieme <[email protected]> wrote:
> I apologize for asking what seems to be such a basic question, but I could
> use some help with submitting a job to a remote server.
>
> I have downloaded and installed Hadoop locally in pseudo-distributed mode,
> and I have written some Java code to submit a job.
>
> Here's the org.apache.hadoop.util.Tool and org.apache.hadoop.mapreduce.Mapper
> I've written.
>
> If I enable the conf.set("mapred.job.tracker", "localhost:9001") line, then I
> get the exception included below.
>
> If that line is disabled, then the job completes. However, in reviewing the
> Hadoop server administration page (http://localhost:50030/jobtracker.jsp), I
> don't see the job as having been processed by the server. Instead, I wonder
> whether my Java code is simply running the mapper in-process, bypassing the
> locally installed server entirely.
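>
> I suppose I could confirm that by logging the tracker address the job
> actually sees; as I understand it, mapred.job.tracker defaults to "local"
> when the LocalJobRunner is used. Something like this (my own diagnostic, not
> anything from the docs):
>
>     // If this prints "local", the job ran in-process via the LocalJobRunner
>     // rather than being submitted to the jobtracker.
>     log.error("mapred.job.tracker = {}", conf.get("mapred.job.tracker", "local"));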
>
> Thanks in advance.
>
> Alex
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.conf.Configured;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
> import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
> import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
> import org.apache.hadoop.util.Tool;
> import org.apache.hadoop.util.ToolRunner;
> import org.slf4j.Logger;
> import org.slf4j.LoggerFactory;
>
> public class OfflineDataTool extends Configured implements Tool {
>
>     // Logger declaration was missing from my original post; assuming slf4j.
>     private static final Logger log = LoggerFactory.getLogger(OfflineDataTool.class);
>
>     public int run(final String[] args) throws Exception {
>         final Configuration conf = getConf();
>         // Enabling this line produces the EOFException I mentioned.
>         //conf.set("mapred.job.tracker", "localhost:9001");
>
>         final Job job = new Job(conf);
>         job.setJarByClass(getClass());
>         job.setJobName(getClass().getName());
>
>         job.setMapperClass(OfflineDataMapper.class);
>         job.setInputFormatClass(TextInputFormat.class);
>
>         job.setMapOutputKeyClass(Text.class);
>         job.setMapOutputValueClass(Text.class);
>         job.setOutputKeyClass(Text.class);
>         job.setOutputValueClass(Text.class);
>
>         FileInputFormat.addInputPath(job, new Path(args[0]));
>
>         // Delete any previous output so the job doesn't fail on an existing directory.
>         final Path output = new Path(args[1]);
>         FileSystem.get(conf).delete(output, true);
>         FileOutputFormat.setOutputPath(job, output);
>
>         return job.waitForCompletion(true) ? 0 : 1;
>     }
>
>     public static void main(final String[] args) {
>         try {
>             final int result = ToolRunner.run(new Configuration(), new OfflineDataTool(),
>                     new String[]{"offline/input", "offline/output"});
>             log.error("result = {}", result);
>         } catch (final Exception e) {
>             throw new RuntimeException(e);
>         }
>     }
> }
>
> import java.io.IOException;
>
> import org.apache.hadoop.io.LongWritable;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapreduce.Mapper;
> import org.slf4j.Logger;
> import org.slf4j.LoggerFactory;
>
> public class OfflineDataMapper extends Mapper<LongWritable, Text, Text, Text> {
>
>     // Logger declaration was missing from my original post; assuming slf4j.
>     private static final Logger log = LoggerFactory.getLogger(OfflineDataMapper.class);
>
>     @Override
>     protected void map(final LongWritable key, final Text value, final Context context)
>             throws IOException, InterruptedException {
>         // For now, just log each input line; nothing is written to the context yet.
>         final String inputString = value.toString();
>         log.error("inputString = {}", inputString);
>     }
> }
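>
> (Eventually this mapper will need to emit key/value pairs; I assume that
> would look something like context.write(new Text(someKey), new Text(inputString)),
> where someKey is a placeholder, but for now I'm only logging each line.)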
>
> --
> Nitin Pawar