Hello Bhaskar,

      That's great. The best approach to streaming logs depends on the
type of source you want to watch. Looking at your use case, I would
suggest going for "multi-hop" flows, where events travel through
multiple agents before reaching the final destination.
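
      Just to give you an idea, here is a rough, untested sketch of such
a two-hop flow: each web host runs an agent that tails the access log
and forwards events over Avro to a collector agent, which rolls them
onto the local file system. The collector host name and port below are
placeholders (I reused the FILE_ROLL directory from your earlier
config), so treat it as a starting point rather than a finished config.

# --- agent on each web host (first hop) ---
agent1.sources = tail
agent1.channels = mem1
agent1.sinks = avro-forward

agent1.sources.tail.type = exec
agent1.sources.tail.command = tail -f /var/log/access.log
agent1.sources.tail.channels = mem1

agent1.sinks.avro-forward.type = avro
agent1.sinks.avro-forward.hostname = collector.example.com
agent1.sinks.avro-forward.port = 4545
agent1.sinks.avro-forward.channel = mem1

agent1.channels.mem1.type = memory

# --- collector agent (second hop, on the collector host) ---
collector.sources = avro-in
collector.channels = mem1
collector.sinks = roll

collector.sources.avro-in.type = avro
collector.sources.avro-in.bind = 0.0.0.0
collector.sources.avro-in.port = 4545
collector.sources.avro-in.channels = mem1

collector.sinks.roll.type = FILE_ROLL
collector.sinks.roll.sink.directory = /flume_runtime/logs
collector.sinks.roll.channel = mem1

collector.channels.mem1.type = memory

      Guillaume's flume.conf linked further down in this thread is
organised along the same lines if you want another reference.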

Regards,
    Mohammad Tariq


On Thu, Jun 14, 2012 at 10:48 PM, Bhaskar <bmar...@gmail.com> wrote:
> I know what I was missing :-)  I missed connecting the sink with the channel.
>  My small POC works now and I am able to view the streamed logs.  Thank you
> all for the guidance and patience in answering all my questions.  So, what's
> the best approach to stream logs from other hosts?  Basically, my next task
> would be to set up a collector (sort of) model that streams logs to an
> intermediary and then from the collector to a sink location.  I'd appreciate
> any thoughts/guidance in this regard.
>
> Bhaskar
>
>
> On Thu, Jun 14, 2012 at 12:52 PM, Bhaskar <bmar...@gmail.com> wrote:
>>
>> For testing purposes, I tried the following configuration without much
>> luck.  I can see that the process started fine, but it just does not write
>> anything to the sink.  I guess I am missing something here.  Can one of you
>> gurus take a look and suggest what I am doing wrong?
>>
>> Thanks,
>> Bhaskar
>>
>> agent1.sources = tail
>> agent1.channels = MemoryChannel-2
>> agent1.sinks = svc_0_sink
>>
>>
>> agent1.sources.tail.type = exec
>> agent1.sources.tail.command = tail -f /var/log/access.log
>> agent1.sources.tail.channels = MemoryChannel-2
>>
>> agent1.sinks.svc_0_sink.type = FILE_ROLL
>> agent1.sinks.svc_0_sink.sink.directory=/flume_runtime/logs
>> agent1.sinks.svc_0_sink.rollInterval=0
>>
>> agent1.channels.MemoryChannel-2.type = memory
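
(Inline note for readers of the archive: the piece missing from the
config above is the sink-to-channel binding that Bhaskar mentions
fixing in his follow-up at the top of this thread. With the names used
here, it would be roughly the single line

    agent1.sinks.svc_0_sink.channel = MemoryChannel-2

added alongside the other svc_0_sink properties.)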
>>
>>
>> On Thu, Jun 14, 2012 at 4:26 AM, Guillaume Polaert <gpola...@cyres.fr>
>> wrote:
>>>
>>> Hi Bhaskar,
>>>
>>> This is the flume.conf (http://pastebin.com/WULgUuaf) that I'm using.
>>> I have an avro server on the hadoop-m host and one agent per node (the
>>> slave hosts). Each agent sends the output of an exec command to the avro
>>> server.
>>>
>>> Host1 : exec -> memory -> avro (sink)
>>> Host2 : exec -> memory -> avro (sink)   >>>>>   MainHost : avro (source) -> memory -> rolling file (local FS)
>>> ...
>>> Host3 : exec -> memory -> avro (sink)
>>>
>>>
>>> Use your own exec command to read your Apache log.
>>>
>>> Guillaume Polaert | Cyrès Conseil
>>>
>>> From: Bhaskar [mailto:bmar...@gmail.com]
>>> Sent: Wednesday, June 13, 2012 19:16
>>> To: flume-user@incubator.apache.org
>>> Subject: Newbee question about flume 1.2 set up
>>>
>>> Good Afternoon,
>>> I am a newbie to Flume and have read through the limited documentation
>>> available.  I would like to set up the following to test things out:
>>>
>>> 1.  Read Apache access logs (as the source)
>>> 2.  Use a memory channel
>>> 3.  Write them to an NFS (or even local) file system
>>>
>>> Can someone help me with the necessary configuration?  I am having a
>>> difficult time gleaning that information from the available documentation.
>>> I am sure someone has done such a test before, and I would appreciate it if
>>> you could pass on that information.  Secondly, I would also like to stream
>>> the logs to a remote server.  Is that a log4j configuration, or do I need to
>>> run an agent on each host to do so?  Any configuration examples would be of
>>> great help.
>>>
>>> Thanks,
>>> Bhaskar
>>
>>
>
