Hello Juhani,
Actually my intention is to distribute the events to different exec sources,
and they could then go to the same or different sinks.
Something like this:

Logs --> LoadBalancer --> Exec source 1 --> Avro sink
                     |--> Exec source 2 --> (same Avro sink as above)
                     |--> Exec source 3 --> HDFS sink

This way, if one of the nodes running an exec source is down, the other two
will keep receiving the events and there will be no interruption in the flow.
Is this possible?

Or is there any other way I can keep backup nodes, so that as soon as one
goes down we can bring up a replacement?
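(For what it's worth, the failover I'm after on the delivery side might be
achievable with a single exec source feeding a sink group. Below is only a
sketch based on the Flume NG user guide's failover sink processor; the agent
name, hostnames, and ports are placeholders, not anything from our setup:)

  agent.sources = tail
  agent.channels = c1
  agent.sinks = avro1 avro2
  agent.sinkgroups = g1

  # One exec source tailing the log; -F survives log rotation
  agent.sources.tail.type = exec
  agent.sources.tail.command = /usr/bin/tail -F weblogs.txt
  agent.sources.tail.channels = c1

  agent.channels.c1.type = memory

  # Two Avro sinks pointing at different collector hosts
  agent.sinks.avro1.type = avro
  agent.sinks.avro1.hostname = collector1
  agent.sinks.avro1.port = 4141
  agent.sinks.avro1.channel = c1

  agent.sinks.avro2.type = avro
  agent.sinks.avro2.hostname = collector2
  agent.sinks.avro2.port = 4141
  agent.sinks.avro2.channel = c1

  # Failover processor: avro1 is preferred; avro2 takes over if it fails
  agent.sinkgroups.g1.sinks = avro1 avro2
  agent.sinkgroups.g1.processor.type = failover
  agent.sinkgroups.g1.processor.priority.avro1 = 10
  agent.sinkgroups.g1.processor.priority.avro2 = 5
  agent.sinkgroups.g1.processor.maxpenalty = 10000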

Regards,
Som



On Wed, May 16, 2012 at 3:40 PM, Juhani Connolly <
[email protected]> wrote:

> What you're describing isn't possible.
>
> However, if you just want to send the results from one exec source to
> multiple destinations, work is being done on a load-balancing sink, and
> using that would make more sense. If you absolutely need to have multiple
> exec sources, you'd have to write the logs out to separate files, since
> sources cannot communicate with each other and thus could not coordinate
> each other's positions.
>
> If you could describe what you are trying to do, maybe we can help.
>
>
> On 05/16/2012 06:13 PM, shekhar sharma wrote:
>
>> Hello all,
>> Is it possible to use some kind of load balancing across Flume sources?
>> For example:
>> I am using an exec source to tail the output of the log file ("/usr/bin/tail
>> -f weblogs.txt"). Is it possible to have several exec sources running on
>> the same log file, with some kind of load balancer in between which can
>> send the logs to different exec sources?
>>
>>
>> Regards,
>> Som
>>
>
>
