I was running tcpdump to capture network traffic to a file for about 20
seconds as a test.  There was a LOT of traffic, and the file ended up being
about 1.2 MB.  At that rate, running tcpdump for 1.5 hours would fill that
file system.  So what I'm wondering is this: if I pipe tcpdump's output to
a Perl program that keeps a small cumulative log instead of the raw capture,
will it be able to run fast enough on this 166 MHz machine to keep up with
the packet data?  How fast can Perl run?  How fast can it parse data?  Am I
looking at this all wrong?

Tim
