Hi, I have a TurboGears 2.1.2 app that kicks off a long-running
parallel task. The task runs in many threads and on several
machines, and they all need to use the model to update the database.
What are the best practices for running DBSessions in threads? I
read somewhere that I should create one scoped_session per thread, and
I am generally doing that with the following initialization code:
from sqlalchemy import create_engine, orm
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('mysql://user:pass@localhost:3306/foo', echo=False)
DeclarativeBase = declarative_base()
maker = orm.sessionmaker(autoflush=True, autocommit=False)
DBSession = orm.scoped_session(maker)
maker.configure(bind=engine)
Is that right?
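In each worker thread I then do roughly the following. Here it is condensed to a self-contained sketch: the Item model and the throwaway sqlite file are just stand-ins for my real model and the MySQL database, so I can show the whole pattern in one piece:

```python
import os
import tempfile
import threading

from sqlalchemy import Column, Integer, String, create_engine, orm
try:
    from sqlalchemy.orm import declarative_base        # SQLAlchemy 1.4+
except ImportError:
    from sqlalchemy.ext.declarative import declarative_base  # older versions

# Stand-in for the real MySQL database: a throwaway sqlite file.
fd, db_path = tempfile.mkstemp(suffix='.db')
os.close(fd)
engine = create_engine('sqlite:///' + db_path,
                       connect_args={'check_same_thread': False})

DeclarativeBase = declarative_base()

class Item(DeclarativeBase):
    # Stand-in model; the real app uses the TurboGears model classes.
    __tablename__ = 'items'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

DeclarativeBase.metadata.create_all(engine)

maker = orm.sessionmaker(autoflush=True, bind=engine)
DBSession = orm.scoped_session(maker)

def worker(n):
    session = DBSession()        # scoped_session: same Session per thread
    try:
        session.add(Item(name='item-%d' % n))
        session.commit()         # autocommit is off, so commit explicitly
    finally:
        DBSession.remove()       # discard this thread's session when done

threads = [threading.Thread(target=worker, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The commit() and DBSession.remove() calls in the worker are the parts I'm least sure about.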
Since I'm creating a new session and not using the one from the model,
do I need to commit()? I assume it is not safe to pass SQLAlchemy
results between threads. Do I need to explicitly close() the session
when I am done in each thread? Will I run into problems if 20
different threads with 20 different sessions are accessing the
database at the same time? I thought I saw some kind of config
regarding connection_pool... is this related?
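The pool settings I was asking about seem to be keyword arguments to create_engine. Something like this, I think, though the numbers here are guesses on my part, not values I know to be right:

```python
from sqlalchemy import create_engine

# pool_size/max_overflow/pool_recycle are real create_engine keyword
# arguments, but these particular numbers are just examples.
engine = create_engine(
    'mysql://user:pass@localhost:3306/foo',
    echo=False,
    pool_size=20,       # connections kept open in the pool
    max_overflow=10,    # extra connections allowed under load
    pool_recycle=3600,  # recycle connections before MySQL's wait_timeout
)
```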
Would it be better to create a singleton that holds the session and
then instantiate that wherever I need it? I have done that with
SQLObject before and it worked great.
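What I did back then was a little holder class; a minimal sketch of the same idea (SessionHolder is a hypothetical name, and the factory here is just a placeholder for whatever session object it would hold):

```python
class SessionHolder(object):
    """Hypothetical singleton that hands out one shared session factory."""
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super(SessionHolder, cls).__new__(cls)
        return cls._instance

    def __init__(self, factory=None):
        # Only set the factory the first time; later "instantiations"
        # return the same object and keep the original factory.
        if factory is not None and getattr(self, 'factory', None) is None:
            self.factory = factory
```

So every SessionHolder() call anywhere in the code would hand back the same object.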
Sorry for all of the questions. I have been trying all kinds of
things and I can't seem to get it quite right. Thanks.
~Sean
--
You received this message because you are subscribed to the Google Groups
"TurboGears" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/turbogears?hl=en.