Sam Tetherow wrote:

I am running about 10 gigs/month in data and currently haven't deleted anything since I started in May. It looks like you can get about 4.5:1 compression on the data using bzip2, though, so I'd be looking at a little over 2 GB/month of compressed data.

For the last four hours, I've got three million flows summarizing 25 million packets and 13.4 gigs of traffic, sucking up about 160 MB of disk space. That adds up to about a gig a day. While disk space is cheap, it ain't free. :)
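The back-of-the-envelope math on that, using the figures above (a 30-day month assumed for the monthly estimate):

```python
# Flow-record storage growth, from the observed numbers above.
mb_per_4h = 160                      # collector output over four hours
mb_per_day = mb_per_4h * (24 / 4)    # scale to a full day
gb_per_month = mb_per_day * 30 / 1024

print(f"{mb_per_day:.0f} MB/day, ~{gb_per_month:.1f} GB/month uncompressed")
```

So roughly a gig a day, call it 28 GB a month before any compression.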

Obviously, bzip2 won't help too much, because the data will either be post-processed (i.e. dumped into a database, which probably can't be compressed) or left as-is in flat files for command-line analysis. I think it really depends on just how long my boss wants historical data kept around. :)
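If the flat-file route wins, a nightly rotate-and-compress pass would keep the growth down without losing anything. A minimal sketch, assuming the collector drops timestamped files into one directory (the directory path and the one-day cutoff are made-up illustrations, not anything from my actual setup):

```python
import bz2
import os
import time

def compress_old_flows(flow_dir, max_age_s=86400):
    """bzip2-compress flow files older than max_age_s, then drop the originals."""
    cutoff = time.time() - max_age_s
    done = []
    for name in sorted(os.listdir(flow_dir)):
        path = os.path.join(flow_dir, name)
        # Skip files already compressed, or recent enough to still be written to.
        if name.endswith(".bz2") or os.path.getmtime(path) > cutoff:
            continue
        with open(path, "rb") as src, bz2.open(path + ".bz2", "wb") as dst:
            dst.write(src.read())
        os.remove(path)  # keep only the compressed copy
        done.append(name)
    return done

# e.g. compress_old_flows("/var/flows")  -- hypothetical collector directory
```

bzcat/bzgrep still work fine on the results for quick command-line digging, so nothing is lost until the files age out entirely.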

A bit of tinkering will, of course, be required.

My throughput on average is running about 15M, and about 10M of that is torrents that I seed for various IPTV programs and Linux distributions.

I'm not even counting all the office BitTorrent seeding. :)

(One of these years I need to upgrade my core router to RouterOS 2.9, partly for features like NetFlow. The new NetFlow collector only sees traffic to my wireless network, and doesn't see any of the traffic to, say, my Web and mail servers, my torrent box, or my GBStv setup, which is probably another several MB/s I'd have to account for.)

David Smith
WISPA Wireless List:
