I'm answering myself because I found a solution to my problem, and I
wanted to share the results with the list, in case someone is struggling
to do the same thing...

The problem with MySQL (go easy, we're not here to debate the fact that
MySQL is primitive and overall does not measure up to the well-known
transactional RDBMSes) is that it does not provide any SQL statement
profiling or timing statistics. If you use the mysql client to run a
query, you can see a time. You can even write a Perl script using the
Benchmark module that comes with MySQL to get more SQL timings.

However, neither of these methods works with Zope (i.e. do you really feel
like interfacing to Perl from Zope, and then calling the Benchmark module
for EVERY one of your queries?). A more "automated" approach is desirable,
something comparable to Oracle's profiling, which logs ALL queries to a
file with timing statistics.

After searching for a few hours for information, I finally decided to
modify Zope to handle this for me. The logical place to put it was
ZMySQLDA. I'm attaching a patch for a really _simple_ logging mechanism.
All it does is note the time before and after each query is processed and
fetched, compute the difference, and log it to /tmp/mysql_timing.log ---
Should you no longer want the timing to occur, turn off MYSQL_TIMING in
db.py of ZMySQLDA after applying the patch. And as usual, a Zope restart
is required after patching your ZMySQLDA.

The patch was written for ZMySQLDA 1.2.0 (nothing earlier).
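For anyone who just wants the shape of the change without reading the diff, the timing logic boils down to the pattern below. This is a minimal standalone sketch, not ZMySQLDA code: `timed_query` and `fake_execute` are hypothetical names, and the real patch writes to /tmp/mysql_timing.log instead of an in-memory buffer.

```python
import io
import time

def timed_query(execute, qs, log):
    # Note the time, run the query, note the time again, and append the
    # difference to the log -- the same bookkeeping the patch wraps
    # around each query in ZMySQLDA's db.py.
    start_time = time.time()
    result = execute(qs)  # stand-in for the real database call
    tot_time = time.time() - start_time
    log.write("SQL Statement: " + qs + "\n")
    log.write("Execution Time: " + str(tot_time * 1000) +
              " milliseconds (ms)\n\n")
    log.flush()  # flush each entry so a crash doesn't lose the tail
    return result

# Hypothetical stand-in for the database driver:
def fake_execute(qs):
    time.sleep(0.01)
    return [("ok",)]

log = io.StringIO()
rows = timed_query(fake_execute, "select 1", log)
```

The flush after each entry is the part worth keeping: it means the log stays useful even while the site is still being hammered by testers.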


On Mon, 9 Oct 2000, knight wrote:

> Greetings,
> I have a lot of past experience with tuning and timing on Oracle (quite
> simple), but I seem to have hit a hurdle with timing my sql statements in
> MySQL. Getting these timings is _incredibly_ important to finding
> bottlenecks in a web site's performance...
> Now, first, is it possible to turn on profiling for ZSQL methods? I've
> turned on my Zope's profiling, and I can see statistics on how many times
> sqlvar was called, but not specific zsql methods. Second, anyone familiar
> with any mysql settings that will dump a log displaying real-time sql
> executions and the corresponding execute time?
> I like to turn sql timing on, then run around the site for a while, send
> some people to test it out, or throw an http load tester script at it, and
> then check the timing logs to see how things worked out.
> Regards,
> Knight
> _______________________________________________
> Zope maillist  -  [EMAIL PROTECTED]
> http://lists.zope.org/mailman/listinfo/zope
> **   No cross posts or HTML encoding!  **
> (Related lists - 
>  http://lists.zope.org/mailman/listinfo/zope-announce
>  http://lists.zope.org/mailman/listinfo/zope-dev )
--- db.py       Thu Jun 15 02:10:41 2000
+++ db.py       Mon Oct  9 18:06:53 2000
@@ -142,6 +142,12 @@
     def __init__(self,connection):
+        self.MYSQL_TIMING = 1
+        try:
+          self.timinglog = open('/tmp/mysql_timing.log', 'w+')
+        except IOError:
+          print "Error: Can't open /tmp/mysql_timing.log for writing"
+          self.MYSQL_TIMING = None
         if info and len(info) < 4:
@@ -174,6 +180,7 @@
               select_m=regex.compile('[\0- ]*select[\0- ]+',
+        import time
             queries=filter(None, map(strip,split(query_string, '\0')))
@@ -181,11 +188,22 @@
             for qs in queries:
+                if self.MYSQL_TIMING:
+                  start_time = time.time()
+                    if self.MYSQL_TIMING:
+                      end_time = time.time()
+                      tot_time = end_time - start_time
+                      tmpstr = "SQL Statement: " + qs + "\n"
+                      self.timinglog.write(tmpstr)
+                      tmpstr = "Execution Time: " + str(tot_time*1000) + " milliseconds (ms)\n\n"
+                      self.timinglog.write(tmpstr)
+                      self.timinglog.flush()
                 except: r=None
                 if not r: continue
                 if result:
