Hi,

It really depends on what your needs are.

I don't know much about the fancy things Splunk can do, but you can do some
cool things with Logstash.

There is also a nice UI for Logstash, Kibana (
https://github.com/rashidkpc/Kibana).
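As a rough idea of the kind of filtering Logstash can do before anything gets
indexed, here's a sketch of a pipeline config. The paths and the "^ENV "
pattern are made up for illustration, and the exact filter names and options
depend on your Logstash version, so check the docs for whatever release you
run:

```
# Sketch only -- filter plugins and options vary between Logstash versions.
input {
  file {
    path => "/var/log/app/*.log"   # hypothetical path
  }
}
filter {
  # Drop noise lines (e.g. environment-variable dumps) before indexing,
  # using the grep filter from the 1.1.x series with negate => true.
  grep {
    match => [ "@message", "^ENV " ]
    negate => true
  }
}
output {
  elasticsearch { }
}
```

The point being that you pay (in storage and indexing) only for the events
that survive the filter stage.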


On 9 May 2012 17:04, Mark Walkom <[email protected]> wrote:

> We're looking at 2G a day, which is AUS$30K a year. And were on the S end
> of SME so it's a hell of a lot.
>
> The only way we could cut this amount down would be if we wrote a
> custom parser that read the application logs, cut out all the
> replicated crap (mostly environment variable stuff), and spat out the logs
> to a separate dir for Splunk to read.
> But then we need to deal with extra storage requirements.
>
> Again, when you are a small operation with a small budget, money rules.
> --
> SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
> Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
>
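For what it's worth, the custom parser Mark describes doesn't need to be
anything elaborate. Here's a minimal sketch in Python; the "ENV " noise
pattern and the sample log lines are invented for illustration, since the
real "replicated crap" depends on the application's log format:

```python
import re

# Hypothetical noise pattern -- the real one would depend on what the
# application's environment-variable dump lines actually look like.
NOISE_RE = re.compile(r"^ENV\s+\w+=")

def strip_noise(lines):
    """Yield only the log lines worth indexing, dropping noise lines."""
    for line in lines:
        if not NOISE_RE.match(line):
            yield line

# Example: filter an application log down to what Splunk should read.
sample = [
    "2012-05-09 17:04:01 ERROR payment failed",
    "ENV PATH=/usr/local/bin:/usr/bin",
    "ENV HOME=/home/app",
    "2012-05-09 17:04:02 INFO retrying",
]
kept = list(strip_noise(sample))
```

In practice you'd run something like this over each rotated log file (or as a
pipe) and write the kept lines to the separate directory Splunk monitors,
trading some extra disk for a much smaller licensed daily volume.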