On Nov 16, 11:03 pm, "Russell Keith-Magee" <[EMAIL PROTECTED]>  
wrote:
> I know this is a horribly nebulous question (like all benchmarking),
> and it's completely dependent on the speed of your machine and a
> million other factors. However, if we are going to start adding
> signals to very common operations (like m2m and opening connections),
> we need to know what sort of overhead we are adding in absolute terms.

Okay, I decided to do a bit of profiling to keep the conversation  
moving.  None of this is particularly rigorous stuff, but I figure  
it's a start.

(The uncertainties shown below are the standard deviations across  
repeated runs divided by the square root of the number of runs, after  
discarding the fastest and slowest ten percent of runs as outliers,  
all of which makes for a rough but useful measure.  This was all on my  
development machine, not a pristine test environment (which I don't  
have), so lord knows what external factors may have been involved.)
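For concreteness, here is a sketch of that error estimate.  This is my
own illustrative reconstruction, not the actual script I used; it
assumes the outlier rejection simply trims ten percent of the runs
from each end before computing the standard error.

```python
import statistics

def standard_error(timings, trim_fraction=0.10):
    """Standard deviation of the surviving runs divided by sqrt(n),
    after trimming the given fraction of runs from each end.
    (A sketch of the measure described above, not the exact script used.)"""
    ordered = sorted(timings)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k else ordered
    return statistics.stdev(kept) / len(kept) ** 0.5

# Hypothetical per-run timings, including one slow and one fast outlier:
runs = [2.61, 2.62, 2.63, 2.62, 2.64, 2.61, 2.63, 2.62, 2.95, 2.30]
err = standard_error(runs)
```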

I began by examining the impact of adding the connection_created  
signal in a toy end-to-end test, with a local client requesting a  
reasonably normal page from a simple blog app, using sqlite3 and the  
development server.  The signal listeners here were pretty simple,  
just incrementing a connection counter.  Here is what I saw.

   Page request performance, in seconds per 100 queries:
   trunk (no listeners):  2.624 +/- 0.009
   6064 (no listeners):   2.624 +/- 0.005
   6064 (1 listener):     2.624 +/- 0.006
   6064 (100 listeners):  2.680 +/- 0.005

The differences (or lack thereof) between the first three timings here  
are not statistically significant, but it is clear that by the time I  
get to 100 listeners I'm seeing a real performance hit.  In order to  
better measure the impact of adding the signal even when no listeners  
are present, I zoomed in on the database connection/disconnection  
cycle itself (again using sqlite3 with a database on disk).

   Database connect/close performance, in seconds per 10000 cycles:
   trunk (no listeners):  1.5664 +/- 0.0010
   6064 (no listeners):   1.5895 +/- 0.0011
   6064 (1 listener):     1.7115 +/- 0.0015
   6064 (100 listeners):  4.5087 +/- 0.0036

So, adding the connection_created signal had a roughly 1% impact on  
the speed of the connection/disconnection cycle, and adding a fairly  
simple listener to that signal brought the total impact to roughly 9%.
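Those percentages fall straight out of the cycle timings above (the
figures quoted are rounded down slightly):

```python
# Seconds per 10000 connect/close cycles, from the table above.
trunk = 1.5664          # no signal
no_listeners = 1.5895   # signal present, no listeners
one_listener = 1.7115   # signal plus one simple listener

signal_overhead = (no_listeners - trunk) / trunk * 100    # ~1.5%
listener_overhead = (one_listener - trunk) / trunk * 100  # ~9.3%
```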

Now we can put the pieces together to get a little bit of  
perspective.  Evidently the database connection/disconnection cycle  
accounts for roughly 0.6% of the processing time for the query in  
question (on this computer, with this configuration).  We can use this  
to estimate the impact on the page request performance.

   Estimated impact of 6064 on page request performance:
   6064 (no listeners):  0.009%
   6064 (1 listener):    0.055%
   6064 (100 listeners): 1.12%  (measured 2.13% +/- 0.54%)

There's a fair amount of hand-waving involved in those estimates, but  
by looking at the 100-listener case (where the impact on page requests  
was measurable) we can see that the estimate lands us within a factor  
of two of the measured value.  The difference could be explained in  
any number of ways, but the upshot is that a true impact of 0.02% or  
more on page requests for the no-listener case would be very plausible.
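The arithmetic behind those estimates is simple enough to spell out:
scale the extra per-cycle cost by the fraction of the page request
that the cycle accounts for.

```python
# Reproducing the estimates from the raw timings above.
request = 2.624 / 100               # seconds per request (trunk)
cycle = {                            # seconds per connect/close cycle
    "trunk": 1.5664 / 10000,
    "no listeners": 1.5895 / 10000,
    "1 listener": 1.7115 / 10000,
    "100 listeners": 4.5087 / 10000,
}

# The cycle's share of the request: roughly 0.6%.
cycle_fraction = cycle["trunk"] / request * 100

# Estimated page-request impact: extra cycle time over request time.
estimates = {
    case: (t - cycle["trunk"]) / request * 100
    for case, t in cycle.items() if case != "trunk"
}

# Directly measured impact in the 100-listener case: ~2.13%.
measured_100 = (2.680 - 2.624) / 2.624 * 100
```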

Matt

Matt Hancher
Intelligent Systems Division
NASA Ames Research Center
[EMAIL PROTECTED]
