On Jun 10, 1:58 am, tony <[email protected]> wrote:
> So much to learn, so little time

My feelings exactly.

On Jun 9, 6:51 pm, "[email protected]" <[email protected]> wrote:
> Good thing you exist. Thanks. It's fixed and the new code is on
> github.

Thanks, and I echo that sentiment (the first part) back at you and at
the TiddlyWiki/Web community.


So my rocky trip through database land continues.

The test machine I'm using at work has 256 MB of RAM, so I've been
keeping a close eye on the number of running processes and on memory
usage. With the basic TiddlyWeb environment set up I started adding,
deleting, and editing tiddlers and discovered a problem...

Every single request to TiddlyWeb creates a new connection to
Postgres, and the connections are rapidly eating up the available
memory. I did pretty much all my testing against Postgres, but a quick
check using MySQL revealed the same behavior.

To test further, I reduced the max_connections cap in Postgres to 10.
Since every request (GET, PUT, ...) creates a new connection, the
limit is hit within a minute or so of doing anything in TiddlyWeb.
Edit a tiddler 5 times and there will be 5 new processes. Hitting the
connection limit causes SQLAlchemy to crash with a friendly message
from the database:

FATAL:  connection limit exceeded for non-superusers
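
If anyone wants to watch this happen, one quick way to count the open
backends from Python is to query pg_stat_activity (the DSN below is
made up, adjust it to your setup):

import psycopg2

# count the backend connections Postgres is currently holding open
conn = psycopg2.connect('dbname=tiddlyweb user=postgres')  # placeholder DSN
cur = conn.cursor()
cur.execute("SELECT count(*) FROM pg_stat_activity")
print(cur.fetchone()[0])
conn.close()

Run that before and after editing a few tiddlers and the count climbs
with every request.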

At first I was confused about why existing connections weren't being
reused. I looked at the SQLAlchemy docs: by default create_engine()
uses connection pooling with pool_size=5, so there should never have
been more than 5 idle connections/processes hanging around. When I
first noticed the problem I had around 40 separate open
connections/processes, most of them idle.
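
For reference, this is roughly what those defaults look like (the DSN
is just a placeholder):

from sqlalchemy import create_engine

# QueuePool is the default pool: up to pool_size persistent connections,
# plus at most max_overflow temporary ones under load
engine = create_engine(
    'postgresql://user:pass@localhost/tiddlyweb',  # placeholder DSN
    pool_size=5,      # the default
    max_overflow=10,  # also the default
)

So a single long-lived engine should top out at a handful of
connections, nowhere near 40.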

Now I have a good idea of why the connections are proliferating. On
every request TiddlyWeb initializes its wrappers, and the Store's
__init__ method is among those invoked, so every request creates a new
session and a new database connection with its own connection pool.
The problem is that completed requests leave behind idle sessions that
never get reused, because subsequent requests go and create new
connections anyway.
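
In other words, on every request the store is effectively doing
something like this (a simplified sketch, not the actual sql store
code):

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

class Store(object):
    def __init__(self, db_config):
        # a brand new engine means a brand new connection pool,
        # created once per request and never shared or cleaned up
        self.engine = create_engine(db_config)
        Session = sessionmaker(bind=self.engine)
        self.session = Session()

Each request builds its own engine and pool, so nothing from the
previous request ever gets reused.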


After skimming the SQLAlchemy docs on sessions a bit, I think there
are a couple of options. One option I'm confident will work is to
clean up the session created in each request by calling
Session.remove() after the request is done. An alternative (I'm not
sure it's possible) would be to connect a session to an existing
connection created by a previous request.
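
For the first option, the pattern I have in mind looks roughly like
this (assuming the store holds a single module-level engine and a
scoped_session; the DSN and names are just illustrative):

from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

# one engine, and therefore one connection pool, shared by all requests
engine = create_engine('postgresql://user:pass@localhost/tiddlyweb')  # placeholder DSN
Session = scoped_session(sessionmaker(bind=engine))

# ... Session() handles the reads and writes during a request ...

# once the request is finished, release the session so its connection
# goes back to the pool instead of sitting idle
Session.remove()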

Considering the Session.remove() approach, the question is where to
make the call. It needs to happen after all the database access is
done. One way might be to put the call in a wrapper that gets called
after all Store-related work is finished. I thought it might be
possible to use a destructor, a __del__ method, to do it, but I found
the destructor is not called after each request. That itself raises
some questions.
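
To make the wrapper idea concrete, here's a rough sketch of a WSGI
middleware that does the cleanup once the response has been produced
(the class name is made up, and the session passed in is assumed to be
the store's scoped_session):

class SessionCleanup(object):
    def __init__(self, application, session):
        self.application = application
        self.session = session  # the store's scoped_session

    def __call__(self, environ, start_response):
        try:
            # buffer the response for simplicity; the finally clause
            # runs after the wrapped app has finished producing it
            return list(self.application(environ, start_response))
        finally:
            self.session.remove()

Something like that would sit late in TiddlyWeb's stack of wrappers so
that all the Store-related work is done by the time remove() runs.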

Anyway, before getting into too much detail about possible remedies,
I'll wait for your take on the situation.

Interestingly, I get the impression that you're not seeing this
problem with the SQLite database on peermore, because the site seems
to be doing okay. I think I saw somewhere that SQLite, being a single
file, handles multiple connections differently, so that might account
for it. Either that or your server has a lot of memory and the
incremental increase in memory usage hasn't had a visible effect yet.
If you are getting new connections on each request, you may be in for
an unpleasant surprise somewhere down the line.


On Jun 10, 1:58 am, tony <[email protected]> wrote:
> On Jun 9, 2:20 am, Oveek <[email protected]> wrote:
>
> > Nice work. Any particular reason you used cygwin on Windows?
>
> I figured I should test on as many configurations to my avail.
> Plus, unfortunately, computing is imprisoned by MS product in many
> corporations and I didn't know how to work with twanager with DOS or
> Powershell.
> Another consideration is that I carry the sql store on a USB flash
> drive which affords some portability.
> I didn't try this with the text store but I found surprisingly that
> TiddlyWeb with sql works on Chrome2, Safari4, FF3 with no need for jar
> files. :-)
> The only bits missing for me are having the ability for import/export/
> sync with sqlite, learning up in revisions/roles/policies/multistore
> and possibly interoperability with TiddlyWiki and Cocoa touch.
>
> So much to learn, so little time
>
> be fun!
>
> Best,
> tony