On Mon, Aug 20, 2001 at 04:44:48PM -0700, Tim Howe wrote:
> I was running tcpdump to capture network traffic in a file for about 20
> seconds as a test.  There was a LOT of traffic and this file ended up being
> about 1.2meg in size...  At that rate, if I ran tcpdump for 1.5 hours it
> would fill that file system...  So what I'm wondering is this: If I pipe
> tcpdump's output to a perl program that keeps a cumulative log that doesn't
> get very big, will it be able to run fast enough on this 166mhz machine to
> keep up with the packet data?  How fast can perl run?  How fast can it parse
> data?  Am I looking at this all wrong?
> 

Do you need perl?  Could you do it with cut, grep, and some shell code?
That would probably be faster than perl on a 166MHz machine.
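For instance, something like this rough sketch would keep only a running
per-source packet count instead of the raw capture.  The field positions
are an assumption about tcpdump's default line format ("time src > dst: ...")
and may need tweaking for your version; "summarize" is just a name I made up:

```shell
#!/bin/sh
# Hypothetical sketch: reduce tcpdump's line output to a cumulative
# per-source-address packet count, so the log stays small.
# The awk script scans each line for the ">" field and counts the
# field just before it (the source address in tcpdump's default
# output) -- adjust if your tcpdump prints something different.
summarize() {
    awk '{ for (i = 2; i < NF; i++)
               if ($i == ">") count[$(i-1)]++ }
         END { for (h in count) print count[h], h }'
}

# Live use would look like (-l line-buffers so awk sees packets as
# they arrive; -n skips DNS lookups, which matters on a slow box):
#   tcpdump -l -n | summarize > summary.log
```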

OpenBSD ports you might want to look at (I haven't used any of them for
anything serious, just tried some for kicks :)

net/iplog       "TCP/IP traffic logging tool"
net/tcpstat     "report network interface statistics"
net/tcpshow     "decodes tcpdump output"
net/tcpslice    "tool for extracting and gluing pcap (tcpdump) files"
net/trafd       "the BPF traffic collector"
net/trafshow    "full screen visualization of network traffic"
net/ettercap    "multi-purpose sniffer/interceptor/logger"
net/ngrep       "network grep"
net/ntop        "network usage, interface similar to top"
net/angst       "active packet sniffer"
net/ipfm        "IP bandwidth analysis tool"
net/mtr         "Matt's traceroute - network diagnostic tool"
net/oproute     "network performance measuring tool"

I've heard mention of a tool called ipacct on a mailing list somewhere;
going by the name and what I heard of it, you may want to look at that too.

-- 
<[EMAIL PROTECTED]>