Hi all!

I'm trying to do some network analysis on a university department network,
and I chose NTOP to collect statistics. Ten hours later NTOP crashed due to
lack of memory (the NTOP machine had only 256 MB of RAM).

I spent two or three hours reading references, trying to understand NTOP's
memory limitations. If I understood correctly, it is difficult to do a
long-run analysis (e.g. a week or more) with NTOP on a medium-to-large
network, although it really depends on the machine's specs.

So I decided to try a different approach: collect raw tcpdump output for a
week and then feed that data to NTOP. I've done a little experiment with a
1-minute tcpdump file, and it seems to work well.

Will this method work for a 1-week tcpdump file? I suspect the memory
limitation still poses a problem, but I could do the post-processing on a
different machine (one with 1 GB of RAM). It seems to me that this offline
processing should need less memory than real-time processing.
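For reference, here is a minimal sketch of the workflow I have in mind. The flags are assumptions on my part: -G (time-based rotation) may be missing from older tcpdump builds, and I'm relying on NTOP's -f option for reading a traffic dump file instead of sniffing a live interface -- please check your versions:

```shell
# Capture a week of traffic on eth0, rotating the dump file once a day
# (-G 86400 starts a new file every 24 hours; the strftime pattern in -w
#  names each rotated file -- older tcpdump versions may not support -G)
tcpdump -i eth0 -w 'dump-%Y%m%d.pcap' -G 86400 &

# ...a week later, on the machine with more RAM, feed each capture file
# to NTOP offline (-f reads from a dump file instead of a live interface)
for f in dump-*.pcap; do
    ntop -f "$f"
done
```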

Any feedback from people who have actually done this kind of processing
would be appreciated :)

BTW: as I saw in another post, the tcpdump file only worked when one
specific interface was indicated with the -i parameter (e.g. tcpdump -i
eth0 -w dumpfile).

-pfeito
