As our stats table grows beyond several million records, I'm starting to seriously question the wisdom of direct inserts into it. The dangers of such tight coupling are glaringly apparent when we run reports or database maintenance on the stats table and the site slows to a crawl as the inserts compete for database locks.
So here's my question, gang. Has anyone else experienced this problem? If so, what have you done about it? My first thought is to log to a text file and then BCP the data into the stats table on a short schedule.
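The "log to a text file, bulk-load on a schedule" idea above can be sketched roughly as below. This is only an illustration of the pattern, not FarCry's actual code: the file layout, the `stats` schema, and the use of sqlite3 (standing in for SQL Server plus BCP) are all assumptions.

```python
# Sketch: per-request logging appends to a flat file (no DB locks taken);
# a scheduled job bulk-loads the spool into the stats table and truncates it.
# Schema and file format are illustrative, not FarCry's real ones.
import csv
import sqlite3

def log_hit(log_path, page, userid):
    # Appending one line is cheap and never contends for a database lock.
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([page, userid])

def bulk_load(log_path, conn):
    # Run this on a short schedule (cron / scheduled task) instead of
    # inserting on every page request. Returns the number of rows loaded.
    with open(log_path, newline="") as f:
        rows = list(csv.reader(f))
    conn.executemany("INSERT INTO stats (page, userid) VALUES (?, ?)", rows)
    conn.commit()
    open(log_path, "w").close()  # truncate the spool file once loaded
    return len(rows)
```

With SQL Server you'd replace `bulk_load` with a BCP invocation against the spool file; the point is that the site only ever touches the flat file, so report queries and inserts stop fighting over the same table.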
We've been looking at some sort of aggregation function that truncates the stats table and records aggregates at a specific periodicity. Obviously the aggregation would involve the loss of some detail.
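A minimal sketch of that roll-up idea follows. The table and column names (`stats`, `stats_daily`, `hit_time`, `page`) are assumptions for illustration, and sqlite3 stands in for the real database; the trade-off it demonstrates is the one described above: per-hit detail is discarded, per-day totals survive.

```python
# Sketch: roll detailed hit rows up into per-day, per-page counts,
# then delete the detail rows. Names are illustrative assumptions.
import sqlite3

def rollup_daily(conn):
    # Record per-day, per-page aggregates...
    conn.execute("""
        INSERT INTO stats_daily (day, page, hits)
        SELECT date(hit_time), page, COUNT(*)
        FROM stats
        GROUP BY date(hit_time), page
    """)
    # ...then drop the detail. Who hit what, and exactly when, is lost;
    # the totals for each period are kept. Returns rows deleted.
    deleted = conn.execute("DELETE FROM stats").rowcount
    conn.commit()
    return deleted
```

Run weekly or nightly, this keeps the hot table small; reports go against `stats_daily` instead of millions of raw rows.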
In truth, for larger accounts we simply disable logging -- an insert on every page request can be a significant load on an application. In those instances we rely on the web server log files and a more traditional log analysis program.
In the short term, I would recommend trimming the stats table to hold only the information that is still relevant -- perhaps the last month or last quarter.
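The short-term pruning suggestion is just a dated delete; a rough sketch, again with illustrative names and sqlite3 in place of SQL Server (where you'd run the equivalent DELETE in T-SQL, ideally in batches so the locks stay short):

```python
# Sketch: keep only rows on or after a cutoff date (e.g. the last
# month or quarter). Table/column names are assumptions.
import sqlite3

def prune_stats(conn, cutoff):
    # cutoff is an ISO date string like "2024-01-01".
    # Returns the number of old rows removed.
    deleted = conn.execute(
        "DELETE FROM stats WHERE hit_time < ?", (cutoff,)
    ).rowcount
    conn.commit()
    return deleted
```

On a multi-million-row table it's worth running this off-peak, since the delete itself will take the same locks the inserts are fighting over.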
-- geoff http://www.daemon.com.au/
--- You are currently subscribed to farcry-dev as: [EMAIL PROTECTED] To unsubscribe send a blank email to [EMAIL PROTECTED] Aussie Macromedia Developers: http://lists.daemon.com.au/
