Hi to all. I'm looking for a nice way to analyze server traffic logs
stored in a central repository. Here's the deal: We have several
metric bucketloads of physical servers, each of which hosts a number of
VPSes. Each VPS looks to its user, our customer, like a dedicated
physical server, of course, and as such has its own set of Apache,
sendmail, etc. logs. My job involves, in part, predictive analysis. I
often need to grab logs from a specific subset of accounts, analyze
their traffic, and design tests based on that data. To minimize impact
on customer accounts, I bring the logs to a central repository on a
FreeBSD server and use command-line tools to analyze them there.
I'd love to be able to use publicly available tools to perform the
analysis, but most of what I see out there is designed for analysis and
monitoring on the production server itself. Is there anything that's
designed more for my situation? I'd love it if the tool were
intelligent enough to treat the log files both separately and as a
composite, but even if it can only generate a single report per VPS
file set, I can awk the separate reports into a composite. The more
command-line-oriented the report, the better. Does
anyone have experience with this sort of thing? We'd prefer free, open
source stuff, but we'll definitely look at exceptional non-free software
as well. Many thanks--
Andrew Hunter
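
P.S. In case it helps to picture what I mean by awk-ing per-VPS reports
into a composite, here's a rough sketch. The directory layout, file
names, and the assumption that logs are in Apache common/combined format
(status code in field 9) are made up for illustration, not our actual
setup:

```shell
#!/bin/sh
# Rough sketch only. Assumes a hypothetical layout: one directory per
# VPS under $REPO, each holding Apache access logs in common/combined
# format (the HTTP status code is whitespace-separated field 9).
set -e
REPO=$(mktemp -d)   # stand-in for the central repository
OUT=$(mktemp -d)    # where per-VPS and composite reports land

# Fake sample data so the sketch runs anywhere; real logs replace this.
mkdir -p "$REPO/vps1" "$REPO/vps2"
cat > "$REPO/vps1/access_log" <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 123
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /a HTTP/1.1" 404 0
EOF
cat > "$REPO/vps2/access_log" <<'EOF'
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 99
EOF

# One report per VPS file set: hits per HTTP status code.
for vps in "$REPO"/*/; do
    name=$(basename "$vps")
    awk '{ n[$9]++ } END { for (s in n) print s, n[s] }' \
        "$vps"access_log* > "$OUT/$name.report"
done

# Awk the separate reports into a composite by summing the counts.
awk '{ total[$1] += $2 } END { for (s in total) print s, total[s] }' \
    "$OUT"/*.report | sort > "$OUT/composite.report"

cat "$OUT/composite.report"
```

With the sample data above, the composite comes out as "200 2" and
"404 1", one status code per line.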
/*
PLUG: http://plug.org, #utah on irc.freenode.net
Unsubscribe: http://plug.org/mailman/options/plug
Don't fear the penguin.
*/