On Jun 11, 5:17 am, Oveek <[email protected]> wrote:
> The upshot is that on either MySQL or Postgres with default settings,
> you are going to hit your max connection limit in under 200 requests.
> If the max_connections setting was sky high, eventually memory would
> be the limiting factor. On my test system it takes between 8 and 18
> new TiddlyWeb requests / new database connections to consume one
> additional MB of memory.
I can start doing similar testing on my end with mysql and hopefully
we'll meet in the middle.
> The second post in that thread has a list of calls that can be used to
> fully close down a connection opened for a session.
That's a useful tidbit.
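For reference, here's a rough sketch of what a "full close-down" sequence tends to look like with SQLAlchemy (which the sqlstore uses). This is my reading, not the exact list from that thread, and the engine URL is just a placeholder:

```python
# Hedged sketch: the usual SQLAlchemy teardown calls for a session's
# connection. Engine URL and usage are illustrative only.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite:///:memory:')
Session = sessionmaker(bind=engine)
session = Session()

# ... handler does its work with the session here ...

session.close()    # release the session's connection back to the pool
engine.dispose()   # tear down the pool itself, closing idle connections
```

Calling `session.close()` alone returns the connection to the pool; `engine.dispose()` is the heavier hammer that actually closes the pooled connections.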
> Still the question remains: where is the best place to do this
> connection closing? It looks to me like there are generally two ways
> to handle it: one, in the sqlstore, where each handler does its own
> closing; or two, outside the store, with a single call to the
> connection closing code.
Maybe in tiddlyweb.web.wsgi.StoreSet:
    def __call__(self, environ, start_response):
        database = Store(environ['tiddlyweb.config']['server_store'][0],
                         environ)
        environ['tiddlyweb.store'] = database
        output = self.application(environ, start_response)
        # XXX completely untested pseudo code!!!
        try:
            database.storage._close_connection()
        except AttributeError:
            pass  # current storage has no _close_connection
        return output
'storage' is the particular StorageInterface implementation object.
Something like that may be a good stopgap, but it doesn't reuse the
connection pool.
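A pool-friendly variant might lean on SQLAlchemy's scoped_session, where the per-request teardown returns the connection to the pool instead of discarding it. This is a sketch under my own assumptions; `handle_request` and `end_request` are hypothetical stand-ins for the WSGI call and the post-request hook, not real sqlstore API:

```python
# Hedged sketch of per-request teardown that keeps the pool warm,
# assuming the store used a SQLAlchemy scoped_session (it may not).
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine('sqlite:///:memory:')
Session = scoped_session(sessionmaker(bind=engine))

def handle_request():
    # Hypothetical handler: checks a session (and its connection)
    # out for this request.
    session = Session()
    # ... use session ...
    return 'response body'

def end_request():
    # Closes the request's session and returns its connection to
    # the pool for reuse by the next request.
    Session.remove()

body = handle_request()
end_request()
```

The key difference from `_close_connection()` in the middleware above is that `Session.remove()` ends the session without throwing away the underlying pooled connection.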
I haven't tried the above. If you get a chance to try it before me,
let me know how it goes.
You received this message because you are subscribed to the Google Groups
"TiddlyWikiDev" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/TiddlyWikiDev?hl=en