Hi,

I have been getting nfdump up and running to produce some daily usage stats. 
So far all is working well with filters and aggregations, but I'm hitting a 
problem with what must be the simplest part. Sorry if this has been covered 
before, but going through the list archives I didn't find an answer.

Let's say I have two routers saving to the following directory structure:

/var/netflow/Router-1/year/month/day/hour/
/var/netflow/Router-2/year/month/day/hour/

(produced by the nfcapd command: nfcapd -p 9995 -n 
Router-1,192.168.0.1,/var/netflow/Router-1 -n 
Router-2,192.168.1.1,/var/netflow/Router-2 -S2)


I have a Perl script that I would like to run daily just after midnight, which 
looks at the prior day's usage, but I'm hitting a problem with how to specify 
the directory nfdump should use.

nfdump -R /var/netflow, or nfdump -R . -M /var/netflow/Router-1:Router-2, works 
just fine if I want to look at the whole subdirectory structure.

I have tried:
- nfdump -R . -M /var/netflow/Router-1 -M /var/netflow/Router-2, but it only 
uses the Router-2 directory
- nfdump -R 2010/06/22 -M /var/netflow/Router-1:Router-2, which understandably 
gives the error 'Not a file'
- nfdump -R . -M /var/netflow/Router-1:Router-2/2010/06/21, which is accepted, 
but it appears to process all directories under the router specified

While I can change my script to process each router separately if there is no 
other solution, I was wondering whether there is a way to do this with the 
available command-line options?
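For reference, the per-router fallback I have in mind looks roughly like this. 
It's only a sketch: the paths match the directory structure above, the date 
arithmetic assumes GNU date's -d option, and the nfdump calls are echoed rather 
than run for illustration.

```shell
#!/bin/sh
# Sketch: run nfdump once per router over yesterday's directory subtree.
# Assumes GNU date for "-d yesterday"; drop the echo to actually run nfdump.
BASE=/var/netflow
DAY=$(date -d yesterday +%Y/%m/%d)   # e.g. 2010/06/21

for router in Router-1 Router-2; do
  # one pass per router: -M selects that router's base dir, -R the day's subtree
  echo nfdump -M "$BASE/$router" -R "$DAY"
done
```

The obvious drawback is that the stats come out per router rather than merged, 
which is exactly why I'd prefer a single invocation if one exists.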



Thanks,

Matt


_______________________________________________
Nfdump-discuss mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/nfdump-discuss
