On Jul 22, "Everton da Silva Marques" wrote:

> On Thu, Jul 21, 2005 at 03:44:58PM -0700, Mike Hunter wrote:
> > On Jul 21, "Alexey Lobanov" wrote:
> > 
> > > Does anyone know a way to further compact raw netflow records by
> > > merging all flows within a *specified* period (e.g., 1 hour) that
> > > share the same src, dst and ports? The aim is to save disk space
> > > without losing important information about traffic details. The
> > > Cisco box already does the same aggregation internally, but its
> > > aggregation interval is too short in most cases, and further
> > > aggregation on a dedicated high-performance machine seems quite feasible.
> > > 
> > > "flow-report" does not solve the problem because I need to have *raw*
> > > data for further analysis: scan detection, etc.
> > 
> > This is a very interesting idea!  Sadly, I don't know of any way to do it
> > :(
> 
> How about a quick-n-dirty custom-aggregator.pl script like:
> 
> flow-cat ... | flow-export -f2 | custom-aggregator.pl | \
>       flow-import -f2 -V5 | flow-send <to-flow-capture>
> 
> Would it work?

Yes, I think so.  Somebody should give it a try.  The open question is
whether the Perl step ends up being too slow.
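
To make it concrete, here's a rough, untested sketch of what
custom-aggregator.pl could look like.  It assumes flow-export -f2 gives
you one comma-separated flow per line, preceded by a '#' header line
naming the columns, and that those names include srcaddr, dstaddr,
srcport, dstport, prot, dpkts, doctets, first and last -- check what
your flow-tools version actually prints and adjust the names if they
differ:

#!/usr/bin/perl
# custom-aggregator.pl -- rough sketch, not tested.
# Merges flows that share src/dst address, ports and protocol, summing
# packet and octet counts and keeping the earliest/latest timestamps.
use strict;
use warnings;

my %idx;     # field name -> column number, taken from the header line
my %flows;   # key -> [dpkts, doctets, first, last, full field list]

while (my $line = <STDIN>) {
    chomp $line;

    if ($line =~ /^#/) {                  # header line: learn field order
        (my $h = $line) =~ s/^#\s*//;
        my @names = split /,/, $h;
        @idx{@names} = 0 .. $#names;
        print "$line\n";                  # pass it through (flow-import may want it)
        next;
    }
    next unless %idx;                     # ignore anything before the header

    my @f   = split /,/, $line;
    my $key = join '|', @f[@idx{qw(srcaddr dstaddr srcport dstport prot)}];

    if (my $a = $flows{$key}) {
        $a->[0] += $f[$idx{dpkts}];       # sum packets
        $a->[1] += $f[$idx{doctets}];     # sum octets
        $a->[2]  = $f[$idx{first}] if $f[$idx{first}] < $a->[2];  # earliest start
        $a->[3]  = $f[$idx{last}]  if $f[$idx{last}]  > $a->[3];  # latest end
    } else {
        $flows{$key} = [ @f[@idx{qw(dpkts doctets first last)}], [@f] ];
    }
}

# One merged record per key.
for my $a (values %flows) {
    my @f = @{ $a->[4] };
    @f[@idx{qw(dpkts doctets first last)}] = @{$a}[0 .. 3];
    print join(',', @f), "\n";
}

Note it aggregates everything it sees on stdin, so the one-hour window
comes from feeding it one hour's worth of files via flow-cat, and memory
grows with the number of distinct (src, dst, ports, proto) tuples.  A
real version would also have to decide what to do with fields that can
differ between merged flows (tcp_flags could be OR'd together, ToS is
less obvious).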

The script I wrote a while ago does some of this with hashes:

http://mailman.splintered.net/pipermail/flow-tools/2003-August/001503.html

But my script preserves less detail than is called for here.

Mike