On Wed, Nov 23, 2011 at 9:11 AM, Richard Arrano <[email protected]> wrote:
> I see, I'm guessing it probably isn't worth it to optimize this
> particular area but it's good to know that the multithreading ability
> would work in a more complex instance where I truly needed the
> parallelism.
>
> One last question on the topic, having to do with threadsafe: the
> function that I was referring to was actually a decorator that checks
> certain permissions that I insert before a large amount of handlers.
> It also stores the returned objects in self.permissions, for example.
> Is there a possibility of a race condition on self.permissions or does
> it function in such a manner that this is impossible?

If you are using webapp2, then a new handler instance is created for
every request, so instance attributes like self.permissions are never
shared between requests.
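A quick way to convince yourself: because each request gets its own handler instance, anything a decorator stores on self is private to that request. A minimal sketch, simulating webapp2's per-request instantiation rather than using the real webapp2 API (the Handler and dispatch names here are hypothetical):

```python
import threading

class Handler(object):
    """Stand-in for a webapp2.RequestHandler subclass."""
    def __init__(self):
        self.permissions = None  # instance attribute: one per request

    def check_permissions(self, user):
        # In the real app this is done by a decorator; inlined for brevity.
        self.permissions = {'user': user, 'can_edit': True}
        return self.permissions

def dispatch(user, results):
    # webapp2-style dispatch: a *fresh* handler per request, so two
    # concurrent requests can never race on the same self.permissions.
    handler = Handler()
    results[user] = handler.check_permissions(user)['user']

results = {}
threads = [threading.Thread(target=dispatch, args=(u, results))
           for u in ('alice', 'bob')]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Each simulated request saw only its own permissions.
```

A race would only be possible if the handler were a module-level singleton or the decorator wrote to a module-level variable instead of self.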

Cheers,
Brian

> Thanks,
> Richard
>
> On Nov 22, 1:56 pm, Brian Quinlan <[email protected]> wrote:
>> On Wed, Nov 23, 2011 at 8:48 AM, Richard Arrano <[email protected]> wrote:
>> > @Brandon:
>> > This is true but it just would take a lot of rewriting that may or may
>> > not be worth it.
>>
>> > @Brian
>> > Thanks for the tip, I didn't even realize that (I haven't been using
>> > AppStats, shame on me). Would the savings be worth it, in your
>> > opinion, when they're not present in the cache and have to resort to 3
>> > gets of varying size?
>>
>> It's hard to give advice on this kind of complexity vs. performance
>> trade-off without really understanding the application.
>>
>> Datastore gets are slower than memcache gets but are still pretty quick.
>>
>> Cheers,
>> Brian
>>
>> > On Nov 22, 12:37 pm, Brian Quinlan <[email protected]> wrote:
>> >> Hi Richard,
>>
>> >> On Wed, Nov 23, 2011 at 7:18 AM, Richard Arrano <[email protected]> 
>> >> wrote:
>> >> > Hello,
>> >> > Quick question regarding multithreading in Python 2.7:
>> >> > I have some requests that call 2-3 functions that call the memcache in
>> >> > each function. It would be possible but quite complicated to just use
>> >> > get_multi, and I was wondering if I could simply put each function
>> >> > into a thread and run the 2-3 threads to achieve some parallelism.
>> >> > Would this work, or have I misunderstood what we can and cannot do
>> >> > with regards to multithreading in 2.7?
>>
>> >> This will certainly work, but I'm not sure that it would be worth the
>> >> complexity.
>>
>> >> Fetching a value from memcache usually takes <5ms so parallelizing 3
>> >> memcache gets is going to save you ~10ms.
>>
>> >> Cheers,
>> >> Brian
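
The threaded version Richard describes looks roughly like this. A minimal sketch: fake_memcache is a stand-in dict for google.appengine.api.memcache so the example is self-contained; the real code would call memcache.get inside each thread, or skip the threads entirely with memcache.get_multi.

```python
import threading

# Stand-in for google.appengine.api.memcache (assumption: the real
# handler would call memcache.get(key) here instead).
fake_memcache = {'a': 1, 'b': 2, 'c': 3}

def fetch(key, out):
    # Each thread performs one independent cache get.
    out[key] = fake_memcache.get(key)

results = {}
threads = [threading.Thread(target=fetch, args=(k, results))
           for k in ('a', 'b', 'c')]
for t in threads:
    t.start()
for t in threads:
    t.join()
# The three gets run concurrently; at ~5ms per memcache round trip,
# that saves roughly 10ms over three sequential gets.
```

The batched alternative Brian alludes to earlier in the thread, memcache.get_multi(['a', 'b', 'c']), returns the same mapping in a single RPC with no threads at all, which is usually the simpler refactor.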
>>
>> >> > Thanks,
>> >> > Richard
>>
>> >> > --
>> >> > You received this message because you are subscribed to the Google 
>> >> > Groups "Google App Engine" group.
>> >> > To post to this group, send email to [email protected].
>> >> > To unsubscribe from this group, send email to 
>> >> > [email protected].
>> >> > For more options, visit this group at
>> >> > http://groups.google.com/group/google-appengine?hl=en.
>>
>

