Sorry for adding to the noise, but I'd like to vote this one up. I've been using ARCHIVE for massive logging projects for over a year now. I was tracking a few billion rows of (poorly packed) data per month and running various analyses on it without too much trouble.

Break the tables down by day, be careful about how you decide to analyze them, use summary tables, etc.
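For instance, the nightly rotation cron could look roughly like this - just a sketch, with made-up table and column names, assuming the MySQLdb client:

    import datetime
    import MySQLdb

    # Nightly cron: create tomorrow's per-day ARCHIVE table, then roll
    # yesterday's raw rows up into a compact summary table. All table
    # and column names here are illustrative.
    db = MySQLdb.connect(host="localhost", user="stats", passwd="...", db="stats")
    cur = db.cursor()

    today = datetime.date.today()
    tomorrow = (today + datetime.timedelta(days=1)).strftime("%Y%m%d")
    yesterday = (today - datetime.timedelta(days=1)).strftime("%Y%m%d")

    # One ARCHIVE table per day keeps any single scan bounded.
    cur.execute("CREATE TABLE IF NOT EXISTS hits_%s ("
                " profile_id INT NOT NULL,"
                " hit_time DATETIME NOT NULL"
                ") ENGINE=ARCHIVE" % tomorrow)

    # Summarize yesterday's raw hits; after this the raw table can be
    # dropped or shipped off to cheaper storage.
    cur.execute("INSERT INTO hit_summary (day, profile_id, hits) "
                "SELECT DATE(hit_time), profile_id, COUNT(*) "
                "FROM hits_%s GROUP BY DATE(hit_time), profile_id" % yesterday)
    db.commit()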

-Dormando

Todd Lipcon wrote:
Why not use the ARCHIVE storage engine in MySQL to get fast inserts into the database, and then move the rows out from there via cron? ARCHIVE gives very fast insert performance, and it shouldn't add much database load since it's all sequential IO. You could even put this table on a separate MySQL server with 250G 7200RPM RAID 1 or something - no need for anything fancy.
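The logging path itself is about as simple as it gets - a rough sketch with made-up names, again assuming the MySQLdb client (ARCHIVE only supports INSERT and SELECT anyway):

    import MySQLdb

    # Plain INSERTs into the day's ARCHIVE table; appends are
    # sequential, so this stays cheap. Names are illustrative.
    db = MySQLdb.connect(host="logbox", user="stats", passwd="...", db="stats")
    cur = db.cursor()

    def log_hit(profile_id):
        cur.execute("INSERT INTO hits_20071104 (profile_id, hit_time) "
                    "VALUES (%s, NOW())", (profile_id,))

    # Or buffer hits and batch them to cut round trips:
    cur.executemany("INSERT INTO hits_20071104 (profile_id, hit_time) "
                    "VALUES (%s, NOW())",
                    [(101,), (102,), (101,)])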

-Todd

On Sun, 4 Nov 2007, [EMAIL PROTECTED] wrote:

I am trying to figure out how to efficiently track click statistics, such as user profile views, without hitting the database on every hit or writing the statistics to a flat file and processing it by cron. It would be best to save them directly into memcache so they are globally available, and then run periodic database updates. The problem is that there could be thousands of different profiles to count stats for, and since memcache offers no way to enumerate its keys, using the increment function on one key per profile is not an option - the update job could never find all the counters.

Best would be if memcache supported "append", so I could save all hit IDs under one memcache key, then process the list to count the frequency of each ID and issue a batched DB update. So my question is: how do I handle such cases, writing statistics into a shared place and processing the DB update later?
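Roughly the pattern I have in mind, sketched with the python-memcached client and the append call I wish the server supported (the key scheme, bucket size, and profiles table are all made up):

    import time
    import memcache
    import MySQLdb

    mc = memcache.Client(["127.0.0.1:11211"])

    def bucket_key(ts=None):
        # One key per minute; the cron only drains *closed* buckets, so
        # appends and the drain never race on the same key.
        return "profile_hits:%d" % int((ts or time.time()) // 60)

    def record_hit(profile_id):
        key = bucket_key()
        # append() fails if the key doesn't exist yet, so seed it once.
        if not mc.append(key, "%d," % profile_id):
            mc.add(key, "")
            mc.append(key, "%d," % profile_id)

    def flush_previous_bucket(cur):
        key = bucket_key(time.time() - 60)  # the last closed minute
        blob = mc.get(key)
        if not blob:
            return
        mc.delete(key)
        # Count frequency of each hit ID, then issue one UPDATE per
        # profile instead of one per hit.
        counts = {}
        for pid in blob.rstrip(",").split(","):
            counts[pid] = counts.get(pid, 0) + 1
        for pid, n in counts.items():
            cur.execute("UPDATE profiles SET views = views + %s WHERE id = %s",
                        (n, pid))

    # cron entry point:
    db = MySQLdb.connect(host="localhost", user="stats", passwd="...", db="stats")
    flush_previous_bucket(db.cursor())
    db.commit()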

Goodwill

