Hey,
yes, it is possible; I'm submitting jobs from a remote client in exactly
this way in my own implementation.
Implement it like this:
Configuration conf = new Configuration();
conf.set("hadoop.job.ugi", "user,group");
conf.set("namenode.host", "somehost.somedomain");
conf.set("jobtracker.host", "somehost.somedomain");
conf.set("mapred.job.tracker", "somehost.somedomain:someport");
conf.set("fs.default.name", "hdfs://somehost.somedomain:someport");
Job job = new Job(conf, "jobname");
job.setJarByClass(...);
...
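For context, here is what the full client can look like as a
self-contained sketch; the host names, ports, input/output paths, and
the use of the identity Mapper/Reducer are placeholders I picked for
illustration, not values from your cluster:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class RemoteJobSubmitter {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote cluster instead of the local defaults.
        conf.set("fs.default.name", "hdfs://somehost.somedomain:9000");
        conf.set("mapred.job.tracker", "somehost.somedomain:9001");
        // Pre-security Hadoop: identify ourselves as "user,group".
        conf.set("hadoop.job.ugi", "hadoop,hadoop");

        Job job = new Job(conf, "remote-job");
        job.setJarByClass(RemoteJobSubmitter.class);
        // The identity Mapper/Reducer base classes keep the sketch
        // self-contained; a real job plugs in its own classes here.
        job.setMapperClass(Mapper.class);
        job.setReducerClass(Reducer.class);
        // With the default TextInputFormat and identity classes, keys are
        // byte offsets (LongWritable) and values are the lines (Text).
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.addInputPath(job, new Path("/user/hadoop/input"));
        FileOutputFormat.setOutputPath(job, new Path("/user/hadoop/output"));

        // Blocks until the cluster finishes; true = print progress here.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

waitForCompletion(true) blocks and reports progress on the client, which
is convenient when the call comes from a webapp; job.submit() returns
immediately if you would rather fire the job off and poll it later.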
Cheers
Michael
On 07/28/2010 03:36 PM, Sebastian Ruff (open4business GmbH) wrote:
Hey,
is it possible to start a job on a Hadoop cluster remotely? For example,
we have a web application which runs on an Apache Tomcat server, and we
would like to start a MapReduce job on our cluster from within the webapp.
Is this possible? And if yes, what are the steps to get there? Do I just
have to put my namenode and jobtracker addresses in a core-site.xml in
the webapp and call the API?
Thanks a lot,
Sebastian