Yes, you can. Here is a code snippet from our project.

// Old (mapred) API.  MyMapper, MyReducer, MyInputFormat, AttributeValueMap
// and c_JOB_SETTING are classes from our own project.
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.SequenceFileOutputFormat;

final JobConf conf = new JobConf( c_JOB_SETTING.getCustomizeConf() );

// The jar only needs to be readable from the submitting machine;
// JobClient ships it to the cluster during submission.
conf.setJar( c_JOB_SETTING.getJarName() );
conf.setJobName( "MortgageValTest input=" + c_JOB_SETTING.getInputPath()
        + " output=" + c_JOB_SETTING.getOutputPath() );

conf.setOutputKeyClass( Text.class );
conf.setMapOutputKeyClass( Text.class );
conf.setOutputValueClass( AttributeValueMap.class );

conf.setMapperClass( MyMapper.class );
conf.setCombinerClass( MyReducer.class );
conf.setReducerClass( MyReducer.class );

conf.setInputFormat( MyInputFormat.class );
conf.setOutputFormat( SequenceFileOutputFormat.class );

// Serializers for our value types, plus symlink creation for the
// distributed cache.
conf.setStrings( "io.serializations",
        "org.apache.hadoop.io.serializer.WritableSerialization",
        "org.apache.hadoop.io.serializer.JavaSerialization" );
conf.set( "mapred.create.symlink", "yes" );

JobClient.runJob( conf );
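
One thing that matters for the remote case: the conf also has to know where the cluster is, otherwise the job just runs in the local runner on the submitting machine. If those settings are not already in the Configuration you start from, the old-API keys look roughly like this (the host names and ports below are placeholders, not real addresses):

        // Point the client at the remote NameNode and JobTracker
        // (placeholder addresses -- substitute your cluster's).
        conf.set( "fs.default.name", "hdfs://namenode.example.com:9000" );
        conf.set( "mapred.job.tracker", "jobtracker.example.com:9001" );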
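
On the second part of the question (kicking off a job whenever a file is written to HDFS): as far as I know there are no built-in file-system hooks at this point, but a small poll loop over FileSystem.listStatus() gets you most of the way. A rough sketch; the watch directory, the poll interval and the runMyJob() helper are all placeholders to fill in:

import java.util.HashSet;
import java.util.Set;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWatcher {
    public static void main( String[] args ) throws Exception {
        FileSystem fs = FileSystem.get( new Configuration() );
        Path watchDir = new Path( "/incoming" );      // placeholder directory to watch
        Set<String> seen = new HashSet<String>();
        while ( true ) {
            FileStatus[] entries = fs.listStatus( watchDir );
            if ( entries != null ) {
                for ( FileStatus entry : entries ) {
                    // Submit the job once for each file we haven't seen before.
                    if ( seen.add( entry.getPath().toString() ) ) {
                        runMyJob( entry.getPath() );
                    }
                }
            }
            Thread.sleep( 10000 );                    // poll every 10 seconds
        }
    }

    private static void runMyJob( Path input ) {
        // placeholder: build a JobConf as in the snippet above and
        // call JobClient.runJob( conf ) with `input` as the input path.
    }
}

The obvious caveat is that a file can be listed while it is still being written; a common trick is to have the writer move the file into the watched directory only once it is complete.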

Zhu, Guojun
Modeling Sr Graduate
571-3824370
guojun_...@freddiemac.com
Financial Engineering
Freddie Mac



Félix López <jaaaelpum...@gmail.com> wrote on 06/27/2012 12:54 PM
To: mapreduce-user@hadoop.apache.org
Subject: Send jobs remotely

How can I send jobs remotely? I have a cluster running and I would like to
execute a MapReduce task from another machine (outside the cluster) without
having to run "bin/hadoop jar hadoop.jar main input output" every time.
I don't know whether this is possible in Hadoop, or whether I have to write a
service on the master that runs the job when it is called.

Maybe a good solution would be to run the job when a file is written to
HDFS; I don't know if there are hooks for that in Hadoop.

Thanks 
