I'll just come back and add how I solved this, in case anyone else ever wants to know.
# after_connection runs on every new pooled connection:
# it sets the schema search path and a 5000 ms statement timeout.
external_db = DAL(
    'postgres://connection_string',
    pool_size=20,
    after_connection=lambda self: self.execute(
        'set search_path to my_schema, public; '
        'set statement_timeout to 5000;'
    ),
    migrate=False
)
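With that in place, any statement that runs longer than 5000 ms is cancelled by Postgres itself ("canceling statement due to statement timeout") and the driver raises an OperationalError, so the request errors out quickly instead of leaving web2py hanging.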
Then, if I ever need to export a large amount of data that might take more than 5 seconds, I simply create a function that sets a new timeout for that call.
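Roughly along these lines (just a minimal sketch; the helper name, the 60-second value and the table in the usage line are illustrative, not my exact code):

def run_slow_query(db, sql, timeout_ms=60000):
    # temporarily raise the per-session statement_timeout for one long call
    db.executesql('set statement_timeout to %s;' % int(timeout_ms))
    try:
        return db.executesql(sql)
    finally:
        # restore the 5-second default set in after_connection
        db.executesql('set statement_timeout to 5000;')

rows = run_slow_query(external_db, 'select * from my_schema.big_export_table;')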
On Monday, June 17, 2013 11:20:25 AM UTC+2, hiro wrote:
>
> I am using the DAL to connect to a Postgres server. This server is very
> big and under constant heavy load.
>
> The problem I am experiencing is that from time to time, when the server is
> under really heavy load, or when the tables queried from the DAL are
> currently being recreated, the server just waits.
>
> This causes web2py to just wait as well, and if the user tries to make a few
> of these calls, the web2py server just dies and I have to restart it.
>
> What I would like is some way of making sure no query takes more than 3
> seconds. If one does, the query should be aborted and an error should be
> raised. What is the best way to implement this?
>