No, unfortunately that doesn't help either.

Basically, there will be one or even two tasks for every user request,
but only a small number of them are not allowed to run in parallel. So
tasks will be coming in at a faster rate than user requests. I need a
system to process them in batches, filter out those which should not run
at the same time, and run each batch in parallel for speed.

Example:

A queue of 25 tasks. 2 of those tasks belong to a specific type (group)
and should not run in parallel, and another 3 tasks of another type
should not run in parallel either. (The group types are too many to
make individual queues for them.)

So what needs to be done is to kick out one of the tasks from the
first group and two from the second group, so that the batch has no two
tasks of the same group. Then execute that batch in threads and move
on to the next.
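Just to illustrate the batching scheme I mean, here is a minimal sketch (not App Engine code, the function and worker names are hypothetical): each pass admits at most one task per group into the batch, defers the conflicting ones, runs the batch in threads, and then repeats with the deferred tasks.

```python
from collections import deque
from concurrent.futures import ThreadPoolExecutor


def run_in_batches(tasks, worker):
    """tasks: iterable of (group, payload) pairs; group None = no constraint.

    Hypothetical sketch: build each batch so it contains at most one task
    per group, defer the conflicting tasks to the next batch, and run the
    current batch in parallel threads.
    """
    pending = deque(tasks)
    while pending:
        batch, deferred, seen = [], deque(), set()
        while pending:
            group, payload = pending.popleft()
            if group is not None and group in seen:
                # Same-group conflict: push it back for the next batch.
                deferred.append((group, payload))
            else:
                seen.add(group)
                batch.append(payload)
        # Execute the conflict-free batch in parallel.
        with ThreadPoolExecutor(max_workers=len(batch)) as pool:
            list(pool.map(worker, batch))
        pending = deferred
```

With the example above (2 tasks of group A, 3 of group B, the rest unconstrained), the first batch would carry one A, one B, and everything else; the second batch one A and one B; the third the last B.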

The only way I see to do that in a clean way is a pull queue and a
backend that is always on, which is costly. It could be done much more
efficiently if it were a queue feature. Such a thing would eliminate
all risk of concurrent read/modify/write cycles creating
inconsistent data.

-- 
You received this message because you are subscribed to the Google Groups 
"Google App Engine" group.