Hi,

I've got a question regarding Hadoop configuration. Is it possible to pass
configuration parameters at job startup?
Something like this:

hadoop -HADOOP_HEAPSIZE=4G jar some.jar some.class.to.execute param1 param2
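
Or maybe it works by exporting HADOOP_HEAPSIZE just for the client JVM (if
hadoop-env.sh doesn't already set it), or by passing per-job properties with
the generic -D option, assuming the job is submitted via
ToolRunner/GenericOptionsParser; I'm not sure if either of these is the
intended way:

# client-side heap for this invocation only (value is in MB)
HADOOP_HEAPSIZE=4000 hadoop jar some.jar some.class.to.execute param1 param2

# per-task heap for this job only
hadoop jar some.jar some.class.to.execute -D mapred.child.java.opts=-Xmx4096m param1 param2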

Or do I have to restart the Hadoop cluster every time I want to change
something, even if it is just for a specific job or workflow?
We have some long-running jobs, and we want to start another one with a
slightly different configuration because it needs more memory to finish.
We are using CDH3.

Greetings,
Mat
