Hi all.
In my app, I normally use SQLObject with value caching enabled, like this:

in model.py:

class MyDataClass(SQLObject):
    class sqlmeta:
        cacheValues = True
    foo = UnicodeCol()
    bar = IntCol()
...
However, I now need to process many insertions / updates on this class
inside a transaction. I found that cacheValues must be False when using
transactions, so my first attempt looked like this:
in controllers.py:

def save():
    hub = PackageHub('myapp')
    MyDataClass.sqlmeta.cacheValues = False
    try:
        hub.doInTransaction(do_many_insertions, item_list)
    except Exception:
        import traceback
        traceback.print_exc()
    finally:
        MyDataClass.sqlmeta.cacheValues = True

def do_many_insertions(item_list):
    for item in item_list:
        # actually, multiple SQLObject classes are used
        MyDataClass(foo=item['str'], bar=item['num'])
...
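For comparison, the batching behavior I want is easy to express when the transaction is tied to a connection rather than to a class-level flag. Here is a sketch using the standard-library sqlite3 module instead of SQLObject (the table and column names are invented for illustration):

```python
import sqlite3

def do_many_insertions(conn, item_list):
    # Run all inserts as one unit: the connection context manager
    # commits on success and rolls back on any exception.
    with conn:
        conn.executemany(
            "INSERT INTO my_data_class (foo, bar) VALUES (?, ?)",
            [(item['str'], item['num']) for item in item_list],
        )

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE my_data_class (foo TEXT, bar INTEGER)")
do_many_insertions(conn, [{'str': 'a', 'num': 1}, {'str': 'b', 'num': 2}])
count = conn.execute("SELECT COUNT(*) FROM my_data_class").fetchone()[0]
```

Because the transaction belongs to the connection, nothing global has to be toggled and other threads are unaffected.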
I don't think this is the correct approach: cacheValues is a class-level
attribute, so flipping it affects every thread, not only the one running
the transaction. Moreover, the do_many_insertions function takes a long
time, so it is impractical to hold a lock for the whole transaction just
to block the other threads.
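To illustrate the concern with a runnable sketch (plain classes, no SQLObject involved; FakeSqlmeta is a stand-in I made up): the flag lives on the class, so any other thread that checks it mid-transaction sees the change:

```python
import threading

class FakeSqlmeta:
    # stands in for MyDataClass.sqlmeta; purely illustrative
    cacheValues = True

seen = []

def other_request():
    # another request thread reads the shared class attribute
    seen.append(FakeSqlmeta.cacheValues)

FakeSqlmeta.cacheValues = False      # save() flips the shared flag
t = threading.Thread(target=other_request)
t.start()
t.join()
FakeSqlmeta.cacheValues = True       # save() restores it
```

Here the other thread observes cacheValues == False even though it is not part of the transaction.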
Do I have to give up caching, or give up using MyDataClass and rewrite
the function in raw SQL? Please tell me if there is another way.
--
Yasuo Shirai
You received this message because you are subscribed to the Google Groups
"TurboGears" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at
http://groups.google.com/group/turbogears?hl=en