I'm using dbfpy to read records from a blobstore entry, and I can't get 
through 24K records before hitting the 10-minute wall (the process runs in a 
task queue). Here's my code:

    def get(self):
        count = 0
        cols = ['R_MEM_NAME', 'R_MEM_ID', 'R_EXP_DATE',
                'R_STATE', 'R_RATING1', 'R_RATING2']

        blobkey = self.request.get('blobkey')
        blob_reader = blobstore.BlobReader(blobkey)

        dbf_in = dbf.Dbf(blob_reader, True)  # open read-only

        # Reject files whose first field isn't the expected one
        if not dbf_in.fieldNames or dbf_in.fieldNames[0] != 'R_MEM_NAME':
            logging.info("Invalid record type: %s", dbf_in.fieldNames)
            return

        mysql = mysqlConnect.connect('ratings')
        db = mysql.db
        cursor = db.cursor()

        for rec in dbf_in:
            count += 1
            if count == 1:
                continue

            continue  # bypass the MySQL inserts for now

---
This simple loop should finish in seconds. Instead it gets through a few 
thousand records and then hits the wall.

Note the last "continue", which I added to bypass the MySQL inserts (which I 
previously thought were the culprit).
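One way I could narrow this down is to count how many reads dbfpy actually 
issues against the blob_reader, since each small read may translate into a 
backend fetch. Here's a minimal sketch of that idea, with `io.BytesIO` 
standing in for `BlobReader` and `CountingReader` being a wrapper name I made 
up for illustration:

```python
import io

class CountingReader:
    """Wrap a file-like object and count read()/seek() calls.

    Instrumentation stand-in: wrapping blob_reader like this would show
    how many reads the record loop really triggers.
    """
    def __init__(self, f):
        self._f = f
        self.reads = 0
        self.seeks = 0

    def read(self, size=-1):
        self.reads += 1
        return self._f.read(size)

    def seek(self, pos, whence=0):
        self.seeks += 1
        return self._f.seek(pos, whence)

    def tell(self):
        return self._f.tell()

# Usage with an in-memory stand-in for the blobstore data:
raw = io.BytesIO(b"x" * 1024)       # 1 KiB of fake record data
counted = CountingReader(raw)
while counted.read(32):             # read in 32-byte "records"
    pass
print(counted.reads)                # prints 33 (last read returns b"")
```

If the real count is one read per record, 24K records means 24K round trips, 
which would explain the wall-clock time far better than the loop body does.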

I'm stumped and stuck.

-- 
Mike Lucente