Hi, the LZO compression algorithm was chosen as the best compromise between speed, compression ratio, and CPU usage. I also tested bzip2 and, yes, you get more out of your disk space, but it is considerably more CPU-intensive.
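For illustration, the speed/ratio trade-off is easy to reproduce with Python's standard library. LZO itself is not in the stdlib, so zlib at level 1 stands in here for a fast LZO-class compressor, and the input is made-up repetitive sample text, not real nfcapd records:

```python
import bz2
import zlib

# Made-up, highly compressible sample data standing in for netflow records.
data = b"flow src=10.0.0.1 dst=10.0.0.2 bytes=1234 packets=7\n" * 10000

fast = zlib.compress(data, 1)    # fast, low-effort setting (LZO-like role)
small = bz2.compress(data, 9)    # slower, but a noticeably better ratio

print("original:", len(data), "bytes")
print("zlib -1 :", len(fast), "bytes (%.1f%%)" % (100.0 * len(fast) / len(data)))
print("bzip2 -9:", len(small), "bytes (%.1f%%)" % (100.0 * len(small) / len(data)))
```

The exact percentages depend entirely on the data, of course; the point is only that a heavier compressor buys a smaller archive at the cost of CPU time.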
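As for the read error in the PS quoted below: without digging into the source I can only guess, but one classic pitfall when reading from a pipe is that read() may return fewer bytes than requested, and the chunking depends on the writing program, so "cat | nfdump" and "bzip2 -d | nfdump" can deliver the same bytes in differently sized pieces. A reader that treats a short read as a failure (with errno still 0, hence "Success") would fail in exactly this way. A minimal illustration of short reads and the retry loop that handles them (plain Python, not nfdump code):

```python
import os
import threading

CHUNK = 70000  # larger than the typical 64 KiB pipe buffer

r, w = os.pipe()

def writer():
    # Blocks once the pipe buffer is full, until the reader drains it.
    os.write(w, b"x" * CHUNK)
    os.close(w)

t = threading.Thread(target=writer)
t.start()

# A single os.read() may return fewer bytes than requested, so a robust
# reader has to loop until it has the whole record (or hits EOF).
got = b""
reads = 0
while len(got) < CHUNK:
    part = os.read(r, CHUNK - len(got))
    if not part:
        break  # EOF
    got += part
    reads += 1

t.join()
os.close(r)
print("collected", len(got), "bytes in", reads, "read() calls")
```

If the actual cause turns out to be something else (e.g. an attempt to seek or stat stdin), the symptom would be similar, so this is only a starting point for debugging.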
Anyway, it would be possible to add another compression scheme alongside the existing one and let the user choose. So if there is demand from other people too, we can happily have a look together.

Cheers,
Peter

On 05/09/15 16:01, Miroslav Kratochvil wrote:
> Hello list,
>
> I was wondering whether it is possible to use any better compression for
> nfdump files than the supplied LZO1X-1. By simple measurements at my site,
> files compressed by nfdump shrink to around 38% of their original size,
> while bzip2 (whose speed is still bearable for many environments) gets
> down to around 20% of the original.
>
> For people who need to archive netflow data, archives almost two times
> smaller could be a great space saver. The CPU usage of the compression is,
> IMHO, not a real issue on current machines.
>
> Is there any possibility of replacing the LZO compression with something
> else? I'd happily code a patch for bzip2 support -- is that a viable
> option, or is there some reason not to use this specific compression?
> (I've seen the GitHub migration is underway; I'd just send a merge
> request if it were finished :] )
>
> Thanks a lot in advance,
> -mk
>
>
> PS.:
> Also, on a related note, while trying to work on the compression code,
> I've seen this weird error, unexplainable by traditional means:
>
> $ cat < somefile | nfdump -r -
> [... data are OK]
>
> $ bzip2 somefile
> $ bzip2 -c -d < somefile.bz2 | nfdump -r -
> Date flow start Duration Pr[....]
> Read error in file '(null)': Success
> No matched flows
>
> Any explanation?

------------------------------------------------------------------------------
_______________________________________________
Nfdump-discuss mailing list
Nfdump-discuss@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/nfdump-discuss