Mike Odegard said:
> I need help creating a script to pull out information of a log file.
> This file is on our System Log server.
>
> After I grep to a subset of the file for some tcp/ip protocol, such as
> smtp, ftp, etc., I need to run a script on the subset file to extract
> all the IP addresses, creating a list, dropping duplicates, but then
> showing the line count for each matching IP address.

Instead of scripting this part, why not use syslog-ng and separate
the traffic into unique log files for each service?
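
A minimal sketch of that approach, assuming a syslog-ng source named
s_all is already defined in your config; the filter patterns and file
paths below are placeholders, not anything from your setup:

```
# Route smtp and ftp records into their own files.
filter f_smtp { match("smtp"); };
filter f_ftp  { match("ftp"); };

destination d_smtp { file("/var/log/services/smtp.log"); };
destination d_ftp  { file("/var/log/services/ftp.log"); };

log { source(s_all); filter(f_smtp); destination(d_smtp); };
log { source(s_all); filter(f_ftp);  destination(d_ftp); };
```

That saves you the initial grep step entirely; you'd then only need
the counting stage on each per-service file.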

> For example, each record has 'src=xxx.xxx.xxx.xxx' with the IP address
> of the requesting machine.
> I want to pull out all of the IP addresses on the 'src=' field,
> dropping
> duplicates.
> Then count how many lines for each IP address found in the previous
> step, such as found with 'wc -l'.
>
> 10.43.223.44   143
> 10.67.11.329   2402
> 10.11.5.208     8
>
> etc.
>
> Can this be done with a bash script?  Or is Perl needed?
> Or other tools that can be run from a script file?

Before reinventing the wheel, I'd go looking on SourceForge and
Freshmeat. My guess is there's a ton of log analyzers already written.
Find the one that best matches your needs, then customise the rest.
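
That said, to answer the direct question: yes, plain bash plus the
standard text tools will do it; no Perl needed. A minimal sketch,
assuming the grepped subset is in a file named subset.log (a
placeholder) and that your grep supports the GNU -o option:

```shell
# Pull out every 'src=...' token, strip the prefix, then count
# occurrences of each unique IP address.
grep -o 'src=[0-9.]*' subset.log \
  | sed 's/^src=//' \
  | sort \
  | uniq -c \
  | awk '{ printf "%-15s %s\n", $2, $1 }'
```

uniq -c puts the count in the first column; the final awk step just
swaps the columns so the output matches the "IP then line count"
layout you showed.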


-- 
Neil Schneider                              pacneil_at_linuxgeek_dot_net
                                           http://www.paccomp.com
Key fingerprint = 67F0 E493 FCC0 0A8C 769B  8209 32D7 1DB1 8460 C47D
Secrecy, being an instrument of conspiracy, ought never to be the
system of a regular government.
- Jeremy Bentham, jurist and philosopher (1748-1832)


-- 
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
