Each log level is limited to a set amount of storage, so with that many
requests you will probably miss some of them.

I suggest you check out Brett Slatkin's Google I/O 2010 talk on
high-throughput data pipelines:

http://code.google.com/events/io/2010/sessions/high-throughput-data-pipelines-appengine.html
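If you do end up counting in the datastore, the usual way to keep write
contention down is a sharded counter: split each ad's count across many
shards, increment a random one per request, and sum them when you read.
Here's a minimal pure-Python sketch of the idea (the shard count, the
"ad123" key prefix, and the in-memory dict standing in for datastore
entities are all just illustrative):

```python
import random

NUM_SHARDS = 20  # illustrative; tune to your expected write rate

# Each shard holds a partial count. On App Engine these would be
# datastore entities keyed by shard name, not a dict.
shards = {"ad123-shard-%d" % i: 0 for i in range(NUM_SHARDS)}

def increment(shards):
    # Pick a random shard so concurrent writers rarely hit the
    # same entity, spreading the write load across shards.
    key = "ad123-shard-%d" % random.randrange(NUM_SHARDS)
    shards[key] += 1

def total(shards):
    # Read all shards and sum them to get the daily total.
    return sum(shards.values())

for _ in range(1000):
    increment(shards)

print(total(shards))  # 1000
```

At 60-100 million requests/day you'd likely still want to batch the
increments (e.g. via the task queue) rather than write once per request,
which is what the pipeline approach in Brett's talk builds toward.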

Robert

On Tue, Jul 20, 2010 at 12:47 PM, Scott Newman <[email protected]> wrote:
> Hello all,
>
> I've got a Python ad-serving application that needs to track how many
> times an ad was served daily. I could obviously add a record to the
> datastore, but I'm wondering if logging the request and then exporting/
> parsing it daily would be as effective? This would keep my datastore
> writes way down as I'm looking at about 60-100 million requests/day.
>
> Is the logging consistent enough to rely upon?
>
> Thanks!
>
> Scott
>
> --
> You received this message because you are subscribed to the Google Groups 
> "Google App Engine" group.
> To post to this group, send email to [email protected].
> To unsubscribe from this group, send email to 
> [email protected].
> For more options, visit this group at 
> http://groups.google.com/group/google-appengine?hl=en.
>
>

