The answer should be yes: each process should be able to run on a different host. We are currently doing this.
Host A submits the Kafka Hadoop job to the JobTracker on Host B; Host B then connects to Host C (or many Host Cs).

I planned to have another look at the example to see if any steps are missing, or if the examples need to be beefed up.

Thanks,
-Richard

On Tue, Aug 30, 2011 at 11:39 AM, Ben Ciceron <b...@triggit.com> wrote:
> Let me rephrase this:
>
> Can any of the Kafka processes run outside the Hadoop cluster, as long as
> it can connect to the Hadoop process from that host?
> e.g.:
>
> hostA (NOT in the Hadoop cluster): runs the Kafka Hadoop consumer
> hostB (in the Hadoop cluster): runs the JobTracker
>
> Cheers,
> Ben-
>
> On Mon, Aug 29, 2011 at 4:59 PM, Jun Rao <jun...@gmail.com> wrote:
> > My understanding is that it's not tied to localhost. You just need to change
> > the jobtracker setting in your Hadoop config.
> >
> > Thanks,
> >
> > Jun
> >
> > On Thu, Aug 25, 2011 at 4:31 PM, Ben Ciceron <b...@triggit.com> wrote:
> >
> >> Hello,
> >>
> >> Does the Kafka Hadoop consumer expect the JobTracker to run locally only?
> >> It seems to expect it locally (localhost/127.0.0.1:9001).
> >> Is this a requirement, or is there a way to change it to a remote URI?
> >>
> >> Cheers,
> >> Ben-
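To make Jun's point concrete: the `localhost/127.0.0.1:9001` address comes from Hadoop's default job tracker setting ("local" mode) on the submitting host, not from anything hard-coded in the Kafka consumer. A minimal sketch of the relevant entry in `mapred-site.xml` on the submitting host; the hostname and port here are placeholders for your own cluster:

```xml
<!-- mapred-site.xml on the host that submits the job (e.g. host A above) -->
<configuration>
  <property>
    <!-- Defaults to "local" (in-process runner); point it at the remote JobTracker -->
    <name>mapred.job.tracker</name>
    <value>hostB:9001</value>
  </property>
</configuration>
```

If the job's driver class goes through Hadoop's generic option parsing (e.g. via `ToolRunner`), the same setting can usually be overridden per job on the command line with `-D mapred.job.tracker=hostB:9001` instead of editing the config file.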