How can I submit jobs remotely? I have a cluster running and I would like to launch a MapReduce job from another machine (outside the cluster) without having to run bin/hadoop jar hadoop.jar main input output on the cluster every time. I don't know whether this is possible in Hadoop, or whether I would have to write a service on the master that runs the job when it is called.
Maybe a good solution would be to run the job whenever a file is written to HDFS; I don't know if Hadoop provides hooks for that. Thanks