I'm still faced with the problem of handling a huge amount of data. After trying several routes to a solution, I'm down to this final issue: I need to be able to process a large number of updates per second.
My benchmark is 250 updates per second; any less and I won't be able to keep up with the volume of inbound metrics data being collected. I have rrdcached implemented, so I'm routing all my calls through the daemon socket. Disk I/O shouldn't be the problem, since I'm handing that off to rrdcached to deal with whenever it needs to.

I've tried threading the updates using Perl threads, and the best I can achieve is about 20 per second. That's with system calls to rrdtool update. I've tried the RRDs perlmod, but it crashes with a segfault (due to its lack of thread safety). I've tried multiple approaches within the Perl script (system calls with an &, fork and exec, etc.), but none work any better.

Has anyone had success dealing with large numbers of updates per second, and if so, what solution are you using?
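To make the setup concrete, here's a stripped-down sketch of the update path I'm describing: a worker thread shelling out to rrdtool update, with the write routed through the rrdcached socket via --daemon. The socket path, RRD filenames, and the fake metric feed are placeholders, not my actual collector code.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;

    # Placeholder socket path for the running rrdcached instance.
    my $daemon = 'unix:/var/run/rrdcached.sock';

    my $queue = Thread::Queue->new();

    # Worker: one system() call per update. Each call pays for a full
    # fork/exec of the rrdtool binary, even though rrdcached is
    # absorbing the actual disk I/O.
    my $worker = threads->create(sub {
        while (defined(my $job = $queue->dequeue())) {
            my ($file, $time, $value) = @$job;
            system('rrdtool', 'update', $file,
                   '--daemon', $daemon, "$time:$value");
        }
    });

    # Placeholder feed; in reality these come off the inbound collector.
    $queue->enqueue([ "/var/rrd/metric$_.rrd", time(), $_ ]) for 1 .. 100;
    $queue->enqueue(undef);    # tell the worker to exit
    $worker->join();

Even with rrdcached taking disk I/O out of the picture, every update in this loop still spawns a fresh rrdtool process, which is where I suspect the ~20/s ceiling comes from.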
