Hi all,

I have a little program that performs a massive import of data into an 
SQLite database. As the data I process comes from internet connections, 
I've opted to implement it with threads, so as not to stall waiting for 
each internet request to return its data.

The memory grows and grows indefinitely. If I switch to Postgres the 
problem disappears. I've searched a lot about sqlobject+sqlite and 
threading without finding a solution.

I've read in the main sqlobject doc: "SQLite may have concurrency 
issues, depending on your usage in a multi-threaded environment."

I'll explain my code with a simplified example:

from threading import Thread
from urllib2 import urlopen

from sqlobject import SQLObject, UnicodeCol, connectionForURI

class mySQLObject(SQLObject):
    _connection = connectionForURI("sqlite:test.db")
    item = UnicodeCol()

class get_page(Thread):
    def __init__(self, id):
        self.id = id
        Thread.__init__(self)

    def run(self):
        f = urlopen("http://example.com/article/%d" % self.id)
        mySQLObject(item=f.read())


# main
for i in xrange(1, 10000):
    t = get_page(i)
    t.start()
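
One workaround I've been considering (untested, and the Queue-based 
design below is just my own idea, not something from the sqlobject 
docs) is to let the threads only fetch pages and funnel every insert 
through a single writer thread, so sqlite never sees concurrent writes:

from Queue import Queue

results = Queue()

class fetch_page(Thread):
    # like get_page above, but fetch only -- no db access in the thread
    def __init__(self, id):
        self.id = id
        Thread.__init__(self)

    def run(self):
        f = urlopen("http://example.com/article/%d" % self.id)
        results.put(f.read())   # hand the page off to the writer

def writer(count):
    # the only thread that ever touches the sqlite connection
    for _ in xrange(count):
        mySQLObject(item=results.get())

w = Thread(target=writer, args=(9999,))   # xrange(1, 10000) yields 9999 pages
w.start()
for i in xrange(1, 10000):
    fetch_page(i).start()
w.join()   # assumes every fetch succeeds; a failed fetch would block the writer

I don't know whether that actually sidesteps the memory growth or just 
hides it, though.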


Any clue? Is it impossible to perform this task the way I want because 
of SQLite's limitations?
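
Or would the right fix be to give each thread its own connection? If 
I'm reading the sqlobject docs right, sqlhub.threadConnection is meant 
for that; something like this is what I'd guess (untested, and 
mySQLObject would have to drop its class-level _connection so sqlhub 
gets consulted):

from sqlobject import sqlhub, connectionForURI

class get_page_threadconn(get_page):
    # hypothetical variant of get_page using a per-thread connection
    def run(self):
        # thread-local connection instead of one shared class-level one
        sqlhub.threadConnection = connectionForURI("sqlite:test.db")
        f = urlopen("http://example.com/article/%d" % self.id)
        mySQLObject(item=f.read())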


jonhattan
