[EMAIL PROTECTED] wrote on 02/28/2008 03:01:21 AM:

> We have `flow-capture' running for (looking at the `top' output) 
> about 300 hours and quite a lot of flows. We used the following command:
> /usr/bin/flow-cat -t "2008/02/13 14:00:00" -T "2008/02/13 14:29:59" 
> /usr/local/sflow/ft | customParser
> to parse traffic. But the time it takes to complete the above 
> command has been steadily increasing.

What sort of directory structure do you have beneath /usr/local/sflow/ft? 
Is it the N=3 type (i.e., nested by year, month, and day), like this:

/usr/local/sflow/ft/2008/2008-02/2008-02-27

If so, you can limit your command to just a day by specifying 
where to look:

flow-cat -t "02/27/2008 23:44:59" -T "02/28/2008 21:30:00" 
/usr/local/sflow/ft/2008/2008-02/2008-02-28
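
If you script this, the day's subdirectory can be derived from the date 
rather than typed by hand. A minimal sketch, assuming GNU date(1) and the 
N=3 layout above (the variable names are my own):

```shell
#!/bin/sh
# Derive the N=3 subdirectory (year/year-month/year-month-day) for one day,
# so flow-cat can be pointed at just that day's files. GNU date(1) assumed.
DATE="2008-02-13"            # day of interest
BASE="/usr/local/sflow/ft"   # flow-capture output root, as in this thread

SUBDIR=$(date -d "$DATE" +%Y/%Y-%m/%Y-%m-%d)
echo "$BASE/$SUBDIR"
# flow-cat is then limited to that one directory, e.g.:
#   flow-cat -t "2008/02/13 14:00:00" -T "2008/02/13 14:29:59" \
#       "$BASE/$SUBDIR" | customParser
```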

You might also be interested in FlowViewer, a web interface for 
easy analysis, graphing, and tracking of flow-tools data. FlowViewer 
limits the extent of the flow-cat 'examination' automatically.

http://ensight.eos.nasa.gov/FlowViewer

> We have copied half an hour of traffic from /usr/local/sflow/ft to 
> another directory and have run the same command against that directory. 
> The time taken was small, as expected. So it appears `flow-cat' 
> reads all files in all subdirectories of /usr/local/sflow/ft, which 
> steadily increases the time to parse traffic. Why is that 
> so? Wouldn't it be saner to read only the necessary files?

Yes - I've been surprised that flow-cat appears to take longer than 
seems necessary for what it does. In a multi-step 
command, flow-cat takes the longest time by far.
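
One workaround is to pre-select files by modification time so flow-cat 
never sees the rest of the tree. A sketch, assuming GNU findutils 
(-newermt) and GNU touch (-d); it builds a throwaway fixture just to 
show the idiom:

```shell
#!/bin/sh
# Demonstrate filtering flow files by mtime before flow-cat sees them.
# GNU find's -newermt and GNU touch's -d are assumptions about the platform;
# the file names only imitate flow-capture's naming.
DIR=$(mktemp -d)
touch -d "2008-02-13 13:00" "$DIR/ft-v05.2008-02-13.130000"
touch -d "2008-02-13 14:10" "$DIR/ft-v05.2008-02-13.141000"
touch -d "2008-02-13 15:00" "$DIR/ft-v05.2008-02-13.150000"

# Keep only files modified inside the half-hour window:
MATCHES=$(find "$DIR" -type f -newermt "2008-02-13 14:00" \
               ! -newermt "2008-02-13 14:30")
echo "$MATCHES"
# In real use the list would feed flow-cat, e.g.:
#   find /usr/local/sflow/ft -type f -newermt ... -print0 \
#     | xargs -0 flow-cat | customParser
```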

Does anyone know why this is, and if it could be 'tweaked'?

Thanks,

Joe

_______________________________________________
Flow-tools mailing list
[EMAIL PROTECTED]
http://mailman.splintered.net/mailman/listinfo/flow-tools
