Whew... I think my server would choke... :o) Thanks for the info on the procedure, however.
Rick

-----Original Message-----
From: Cutter (CFRelated) [mailto:[EMAIL PROTECTED]
Sent: Tuesday, February 20, 2007 3:22 PM
To: CF-Talk
Subject: Re: Best Practices for Web Site Traffic Tracking

I've seen dedicated systems built solely for parsing the logs; they can chew up a lot of CPU resource. But they (WebTrends) do have a hosted service that only requires adding a small scriptlet to your pages. They work out pricing according to your projected page hits, then incrementally raise the price if you go over your projections. It takes a lot of load off your systems, but it does require some budget planning.

When a user comes to a site, we look for a cookie with an id. If it exists, we set a session var with the value (never checking for the cookie again). If it doesn't, then we set both the cookie and the (user) session var. We also test for an unexpired 'session' id; if one doesn't exist, we set it. We then record the 'session' id and user id (once per visit) and get the new record's id (for a foreign key).

Every page click gets recorded to a table, including the timestamp, page name, query string, site (since we run more than one), and the foreign key tying it back to a session and user. We can then query on the foreign key, ordering by timestamp, and see each page hit within the user's session from start to finish. Within the session, we can also record other information about a user (from a form submission or whatever) that ties back through the foreign key. This lets us tie specific user information (when available) to a specific session: when they arrived, time on site, time on page, what products they looked at, what forms they submitted, etc. It chews up a lot of CPU.

Cutter
___________________
http://blog.cutterscrossing.com

Rick Faircloth wrote:
> You've hit on one of the reasons I've asked about this.
>
> I don't have any really high-volume sites, but for some of the
> sites that I do track, the database entries really pile up over
> the years.
> And I haven't built in a system for archiving.
>
> So when I go to get stats, it takes a loooong time for the
> database (MySQL, not dedicated) to serve up the results.
> And it spikes the CPU usage as well, which impacts serving sites.
>
> One of the problems with my system is that *all* the stats are
> re-created with *every* request. For instance, cumulative stats
> are always calculated, even if I'm just asking for stats for a certain date.
> I know it seems like a stupid way to go about the setup, but it
> wasn't an issue when the database wasn't so full. But now I've
> got one site with 970,000 traffic records that the system has
> to parse each time a request is made, for *every* type of information:
> cumulative visits per page, total unique visitors today, visits today
> by page, etc., etc.
>
> And beyond the performance issues, I know there is other data I
> could use, like tracking movement through a site, but I don't know
> how to go about collecting it.
>
> With so much on my plate, I just *hate* the thought of re-building the
> system, but it looks like I'll have to. I thought I'd gather some
> *best practices* before I begin.
>
> So, WebTrends doesn't use a database? It just parses server
> logs to get the info? That sure would take a lot of work off the CPU
> and database.
>
> Rick
>
> -----Original Message-----
> From: Cutter (CFRelated) [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, February 20, 2007 2:37 PM
> To: CF-Talk
> Subject: Re: Best Practices for Web Site Traffic Tracking
>
> Rick,
>
> Last week we deployed new code here at work, redesigning our
> application and session startup and management, specifically to
> improve our own click-through user/session tracking on our clients'
> sites.
> Now, we're talking about a shared, templated application system
> that services 1600+ sites (a separate app for each site, same code
> base), but I can tell you that we have a single, dedicated MS SQL
> server/machine that is used solely for stats input, processing, and
> reporting. We have greatly improved (or, our DBA has) how we process
> this data, but it still chews up a ton of time and resources, with
> extreme care and attention paid to indexes and table access locks.
> If you are maintaining a large, high-traffic site, then I would
> definitely weigh your options carefully. Rolling your own stats can
> be very beneficial, especially if you need the ability to track
> specific information (like clicking through steps in a Flash or Flex
> application), but if you don't require that degree of specialized
> tracking, then you may be much better off balancing your application
> structure (page names, subdirectory pathing, etc.) and using
> something that does log parsing (like WebTrends).
>
> Cutter
> ________________
> http://blog.cutterscrossing.com
>
> Rick Faircloth wrote:
>
>>Hi, all.
>>
>>Anyone know of a discussion/tutorial on the best way to go about
>>creating a website traffic management/reporting system?
>>
>>I've been using my own methods for several years, and, while they
>>work well enough, I know there are bound to be better methods.
>>
>>I'm looking not just for coding, but for overall discussion of
>>traffic management, such as archiving older data, tracking visitors'
>>movement through a website... just about anything.
>>
>>Of course, I'm looking for CF-based info.
>>
>>Rick
Archive: http://www.houseoffusion.com/groups/CF-Talk/message.cfm/messageid:270232
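[Editor's note: Cutter's cookie/visit/click flow above can be sketched roughly as follows. This is a minimal stand-in in Python with SQLite, not his actual ColdFusion/MS SQL code; all table, column, and variable names are hypothetical, and dicts stand in for the cookie and session scopes.]

```python
import sqlite3
import uuid
from datetime import datetime

# Hypothetical stand-ins for the browser cookie scope and server session scope.
cookies = {}
session = {}

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE visits (id INTEGER PRIMARY KEY, user_id TEXT, session_id TEXT);
CREATE TABLE clicks (
    visit_id INTEGER REFERENCES visits(id),  -- foreign key back to the visit
    ts TEXT, page TEXT, query_string TEXT, site TEXT
);
""")

def track_hit(page, query_string, site):
    # 1. Look for the user-id cookie; copy it into the session once,
    #    setting the cookie first if it doesn't exist yet.
    if "user_id" not in session:
        session["user_id"] = cookies.setdefault("user_id", str(uuid.uuid4()))
    # 2. Start a 'session' id if none exists, record the visit once,
    #    and keep the new row's id as the foreign key for every click.
    if "visit_id" not in session:
        session["session_id"] = str(uuid.uuid4())
        cur = conn.execute(
            "INSERT INTO visits (user_id, session_id) VALUES (?, ?)",
            (session["user_id"], session["session_id"]))
        session["visit_id"] = cur.lastrowid
    # 3. Every page click gets its own row, tied back by the foreign key.
    conn.execute(
        "INSERT INTO clicks VALUES (?, ?, ?, ?, ?)",
        (session["visit_id"], datetime.now().isoformat(),
         page, query_string, site))

track_hit("/index.cfm", "", "siteA")
track_hit("/products.cfm", "cat=12", "siteA")

# Replay the visit: query on the foreign key, ordered by timestamp
# (rowid as a tiebreaker in case two hits share a timestamp).
pages = [r[0] for r in conn.execute(
    "SELECT page FROM clicks WHERE visit_id = ? ORDER BY ts, rowid",
    (session["visit_id"],))]
```

Note how the visit row is inserted exactly once per session, while every click only writes one small row; the session replay is a single indexed-key query ordered by timestamp, as Cutter describes.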

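[Editor's note: Rick's core problem, re-scanning 970,000 raw rows on every stats request, is commonly solved by rolling raw clicks up into a summary table once (e.g. nightly), so per-date requests read only the small rollup. A sketch of that idea, again in Python/SQLite with assumed table names rather than Rick's actual MySQL schema:]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clicks (ts TEXT, page TEXT);            -- raw traffic rows
CREATE TABLE daily_stats (day TEXT, page TEXT, hits INTEGER,
                          PRIMARY KEY (day, page));  -- rollup table
""")
conn.executemany("INSERT INTO clicks VALUES (?, ?)", [
    ("2007-02-19 10:00", "/index.cfm"),
    ("2007-02-19 11:00", "/index.cfm"),
    ("2007-02-20 09:00", "/products.cfm"),
])

# Aggregate the raw rows once, instead of re-parsing the full click
# table on every stats request.
conn.execute("""
    INSERT INTO daily_stats
    SELECT date(ts), page, COUNT(*) FROM clicks GROUP BY date(ts), page
""")

# A per-date request now reads the small summary table only; cumulative
# totals become a SUM over daily rows rather than a scan of raw clicks.
hits = conn.execute(
    "SELECT hits FROM daily_stats WHERE day = ? AND page = ?",
    ("2007-02-19", "/index.cfm")).fetchone()[0]
```

This also eases the archiving problem Rick mentions: once a day is rolled up, its raw click rows can be moved to an archive table without losing reporting ability.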

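[Editor's note: on Cutter's point about "extreme care and attention paid to indexes": the session-replay query (foreign key filter plus timestamp ordering) only stays cheap if an index covers both columns. A sketch of checking that in SQLite; MS SQL or MySQL have their own plan-inspection tools, and the names here are hypothetical:]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (visit_id INTEGER, ts TEXT, page TEXT)")

# A composite index on the foreign key plus timestamp lets the
# "replay a session in order" query avoid a full table scan and a sort.
conn.execute("CREATE INDEX ix_clicks_visit_ts ON clicks (visit_id, ts)")

# EXPLAIN QUERY PLAN reports whether SQLite will use the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT page FROM clicks WHERE visit_id = ? ORDER BY ts", (1,)
).fetchall()
uses_index = any("ix_clicks_visit_ts" in str(row[-1]) for row in plan)
```

On a high-traffic stats table the trade-off Cutter hints at applies: every extra index speeds reads but adds write cost and lock contention on each click insert, so index only what the reporting queries actually use.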