I don't know whether the solution allows a user to poll more than once in 30 seconds, 
and I don't know whether you in fact need to detect multiple polls from one user 
within 30 seconds, or keep an audit of all the historical polls - these 
questions would all change how this could and should be done.

If you may have up to 1000 users polling in one 30-second segment, then each poll 
bucket would hold a list of roughly 33 user refs (1000 users spread over 30 
one-second buckets). I also don't know what information must be stored when they 
poll - whether you need a time and a user ref, just a ref, or in fact more information.
memcache may lose data - though I am surprised some are saying this could 
happen in this scenario - so again I don't know how accurate this must be, and 
whether a poll lost from time to time would be a problem, 
especially if the same users are polling continually and being added 
to later buckets. Speed over accuracy?

Your first suggestion of using a model is still a good one, though 1000 entities 
being read every n seconds is hefty in my opinion, especially if you need to 
read user data from the datastore and not just get the key. In the poll scenario a 
dictionary could be used: when the user polls, their data is read and stored in 
the poll bucket in memcache at the time of polling, as I have documented, which would 
save re-loading data from the datastore when your task runs.
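A minimal sketch of that idea, using a plain dict as a stand-in for memcache (on App Engine you would use google.appengine.api.memcache instead; the names record_poll and bucket_key and the bucket layout are my own illustration, not a fixed API):

```python
import time

# Stand-in for memcache: a plain dict keyed by bucket name.
cache = {}

def bucket_key(now=None):
    """Map the current server time to one of 30 one-second buckets (0-29)."""
    second = int(now if now is not None else time.time()) % 60
    if second > 29:
        second -= 30          # reuse the same 0-29 range each half-minute
    return 'polls%d' % second

def record_poll(user_ref, user_data, now=None):
    """Store the user's data in the current bucket at poll time, so the
    periodic task never has to re-read it from the datastore."""
    key = bucket_key(now)
    bucket = cache.get(key, {})
    bucket[user_ref] = user_data  # dictionary keyed by user ref
    cache[key] = bucket
```

So a poll at second 5 and another at second 35 both land in the 'polls5' bucket, which is what gives the rolling 30-second window.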

That said, in addition to my last post with the memcache 30-bucket poll solution, a bug 
fix is below for working out the bucket number 0-29; I did mention I ran out of time on 
this method.

The key for each poll bucket would be made as follows:

second = # seconds of the current minute from the server time (0-59)
if second > 29:
    second = second - 30
# this gives us 0-29, then 0-29 again, over each minute

then make the memcache key as:

key = 'polls%d' % second

If 30 buckets are too many, use 15 (one for each 2-second slot) or 10 (one for 
each 3-second slot) - again there are accuracy concerns here, 30 being the 
most accurate; this comes down to what you need from this routine.
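The coarser bucketing is just integer division by the slot width; a small sketch under the same assumptions as above (function and variable names are my own):

```python
def coarse_bucket_key(second, slot_width):
    """Fold a second-of-minute (0-59) into one of 30 // slot_width buckets.

    slot_width=1 gives 30 buckets, 2 gives 15, 3 gives 10.
    """
    if second > 29:
        second -= 30          # same 0-29 fold as the 30-bucket version
    return 'polls%d' % (second // slot_width)
```

For example, with slot_width=2, seconds 6 and 7 (and 36 and 37) all share the bucket 'polls3'.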

If you have a need for the user data stored in the list, there is no need to 
merge it: just load each poll bucket and iterate it, doing whatever function you need 
to pursue. There is no reason to merge the 30 buckets - that's just making extra work.
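A sketch of the no-merge approach, again with a plain dict standing in for memcache and `handle` standing in for whatever per-user function you need (both names are my own illustration):

```python
def process_all_buckets(cache, handle):
    """Visit every poll bucket in turn and apply `handle` to each entry,
    without ever building one merged structure."""
    for second in range(30):
        bucket = cache.get('polls%d' % second)
        if not bucket:
            continue          # bucket may be empty or evicted from memcache
        for user_ref, user_data in bucket.items():
            handle(user_ref, user_data)
```

The `if not bucket` check also covers the memcache-eviction case discussed above: a missing bucket is simply skipped.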

This is the only obvious way I can see of using memcache to perform such a task, 
in comparison to using a model and fetching entities within a time range.

Martin


      
