I'm getting the same thing (or similar) but with SQLite...
I killed anything else that might be connecting to the DB, but now the
only thing I can do is SELECT. I can't UPDATE, DELETE, TRUNCATE, etc.
Is there a way to programmatically unlock a DB? The only thing that
cures it is wiping out the db.
I have 2 scripts running: one that checks for new arrivals of files,
then inserts a record for each file. Another script looks at the DB
and does stuff when it spots a row with the 'status' field set to
'queued'. Even if I kill the first script, the second script craps
out on update():
def checkRecords():
    try:
        while True:
            if len(os.listdir(buildExec)) == 0:
                firstQrow = db((db.Q.id > 0) &
                               (db.Q.status == 'queued')).select().first()
                if firstQrow:
                    makeFile(lockFile)
                    id = firstQrow.id
                    bSpec = firstQrow.buildspec
                    makeFile('{0}/{1}'.format(buildExec, bSpec),
                             firstQrow.content)
                    db(db.Q.id == id).update(status='running')
                    db.commit()
                    execBlueLite(id)
    except Exception as err:
        print('{0}'.format(err))
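For what it's worth, the usual cause of a "stuck" SQLite database is not a corrupt file but an uncommitted write transaction held open by some connection (often the other script, or an interactive shell). Here is a minimal standalone sketch using plain sqlite3 rather than web2py's DAL, with a throwaway temp file, showing that a second connection's UPDATE fails with "database is locked" until the first connection commits:

```python
import os
import sqlite3
import tempfile

# Hypothetical demo database, standing in for the real one.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE q (id INTEGER PRIMARY KEY, status TEXT)")
writer.execute("INSERT INTO q (status) VALUES ('queued')")
writer.commit()

# Start a write transaction and leave it open (no commit yet).
writer.execute("UPDATE q SET status = 'running'")

# A second connection with timeout=0 fails immediately instead of
# retrying for the default 5 seconds.
other = sqlite3.connect(path, timeout=0)
try:
    other.execute("UPDATE q SET status = 'done'")
    locked = False
except sqlite3.OperationalError as err:
    locked = "locked" in str(err)

# Committing (or rolling back) on the writer releases the lock,
# and the same UPDATE now goes through.
writer.commit()
other.execute("UPDATE q SET status = 'done'")
other.commit()
print(locked)
```

So rather than unlocking the file after the fact, the fix is to make sure every script commits (or rolls back) promptly after each write, as the db.commit() in checkRecords() above does, and to close connections when a script exits.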
On Jul 24, 8:06 pm, weheh <[email protected]> wrote:
> This is probably not a web2py question, but here goes anyway. I'm
> migrating to postgres from having prototyped with sqlite. Besides a
> few instances where I had reference fields defaulting to 0 instead of
> None, the migration went smoothly. BUT, I'm having the database get
> locked up all the time whenever I need to delete and sometimes update
> certain records. Ironically, sqlite doesn't do that even though the
> main impetus for migrating to postgres was because I was under the
> impression that it was less likely to get into a locked condition.
>
> Is there some special setting I need that will cause postgres to be
> less likely to lock up? I'm setting pool_size=20 already. Anything
> else I need to be doing? Thx.