> -----Original Message-----
> From: Joseph Bueno [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, March 25, 2003 1:36 AM

..

> If you want to speed it up, you have to make it use an index.
> You need to add a WHERE or an ORDER BY clause.
> Have you tried :
> SELECT Message_ID, Body FROM Body_etc ORDER BY Message_ID LIMIT N,M
>
> Of course, I assume that Message_ID is indexed ;)

Message_ID is the primary key.  But your suggestion doesn't help.  If
anything, it is slower.

However, I think I've figured out the right way to do this -- use a
server-side cursor.  I can completely get rid of the need for a LIMIT in the
SELECT statement.  I've never used server-side cursors before, so I am a bit
surprised to see that even when I do a SELECT for all 1.5 million records,
MySQL's memory usage doesn't increase a bit above where it was when I was
doing the same queries with a normal cursor.  All I have to do now is figure
out how many records I can safely insert in one shot, which is not a
problem.
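For anyone following along, a rough sketch of what I mean (names like
fetch_in_batches and the batch size are mine, not part of MySQLdb; the
commented-out connection code assumes MySQLdb's SSCursor as discussed here):

```python
def fetch_in_batches(cursor, size):
    """Yield lists of up to `size` rows until the result set is exhausted.

    With a server-side cursor, rows stay on the server until fetched,
    so memory use stays flat no matter how big the result set is.
    """
    while True:
        rows = cursor.fetchmany(size)
        if not rows:
            break
        yield rows

# With a real connection it would look roughly like:
#
#   import MySQLdb
#   import MySQLdb.cursors
#   conn = MySQLdb.connect(db="mail", cursorclass=MySQLdb.cursors.SSCursor)
#   cur = conn.cursor()
#   cur.execute("SELECT Message_ID, Body FROM Body_etc")
#   for batch in fetch_in_batches(cur, 1000):
#       pass  # insert the batch into the destination table here
#   cur.close()
```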

Just noticed something odd, though, with the MySQLdb SSCursor.  When close()
is called, it does a fetchall() on any records you hadn't yet retrieved,
trying to load them all into memory.  It does this by calling nextset(),
even though MySQL doesn't support multiple result sets.
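If that behavior holds, a workaround would be to discard any leftover rows
in fixed-size chunks before calling close(), so nothing gets materialized
all at once.  drain() below is a hypothetical helper of my own, not part of
MySQLdb:

```python
def drain(cursor, chunk=1000):
    """Throw away any unfetched rows in small chunks.

    Calling this before close() on a server-side cursor avoids the
    implicit fetchall() pulling the whole remainder into memory at once.
    """
    while cursor.fetchmany(chunk):
        pass

# Usage sketch: if you stop reading a result set early,
#   drain(cur)
#   cur.close()
```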

Nick


-- 
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]
