Hi, this may be a silly question, but is there a way to start a Hadoop job remotely from Java?
What I mean is: I write a map/reduce job (using JobClient), jar it up, and deploy it to my cluster. I know I can log into the master node and start the job from the command line, but I want to kick it off from Java. A use case: some non-Hadoop Java app runs, collects some data, and saves it in a new folder on the Hadoop cluster. Once the save is complete, the app should tell Hadoop to run a job that's already on the cluster, passing it the path to the new folder, and then exit. The job in the jar uses a JobClient that chains together more jobs. Thanks, - Jonathan
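One common approach to the scenario above is to submit the job from the external app with the old `org.apache.hadoop.mapred` API: point a `JobConf` at the remote cluster's NameNode and JobTracker, name the deployed jar, and submit. This is only a sketch under assumptions — the hostnames, ports, jar path, and mapper/reducer class names below are hypothetical, not values from this thread:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;

public class RemoteJobLauncher {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf();

        // Point the client at the remote cluster.
        // Hypothetical hostnames/ports -- substitute your cluster's values,
        // or put core-site.xml / mapred-site.xml on the classpath instead.
        conf.set("fs.default.name", "hdfs://master:9000");
        conf.set("mapred.job.tracker", "master:9001");

        // The jar already deployed to the cluster-facing machine
        // (hypothetical path); it will be shipped with the job.
        conf.setJar("/path/to/myjob.jar");

        // Input is the freshly written folder; output is a new path.
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        // submitJob() returns immediately with a handle, so the
        // launching app can exit; runJob() would block until completion.
        RunningJob job = JobClient.runJob(conf);
        System.out.println("Job complete: " + job.isSuccessful());
    }
}
```

If the app really should "fire and forget" as described, `new JobClient(conf).submitJob(conf)` returns a `RunningJob` handle without waiting, letting the app exit while the jar's own driver chains the follow-on jobs on the cluster side.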
