On 15 Mar 2013, at 7:39 AM, Niphlod <[email protected]> wrote:
> I'd need to test it again, but when I added Excel-like exports my users were 
> downloading ~50 MB files without hiccups (MSSQL, though). "Lost connection 
> during query" seems to point the finger at MySQL not shipping data fast 
> enough, rather than at web2py being too slow to handle the serialization. 
> Did you try raising the timeout in the driver (or requesting less data)?

I haven't tried anything yet. 

Re: requesting less data: I could write a custom query function to do that, but 
right now I'm just using the existing UI, which in both cases exports the whole 
table.
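
If I do end up writing my own exporter, this is roughly what I have in mind: 
page through the table with limitby so no single query has to ship all 270K 
rows at once. Untested, and 'mytable' and the chunk size are placeholders:

    import csv
    import cStringIO

    def export_csv():
        fields = db.mytable.fields
        out = cStringIO.StringIO()
        writer = csv.writer(out)
        writer.writerow(fields)
        chunk, start = 10000, 0
        while True:
            # fetch one page at a time, ordered by id for stable paging
            rows = db(db.mytable.id > 0).select(
                orderby=db.mytable.id,
                limitby=(start, start + chunk))
            if not rows:
                break
            for r in rows:
                writer.writerow([r[f] for f in fields])
            start += chunk
        response.headers['Content-Type'] = 'text/csv'
        response.headers['Content-Disposition'] = \
            'attachment; filename=mytable.csv'
        return out.getvalue()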

How would I go about raising the timeout in the driver? I looked briefly at 
dal.py and it wasn't obvious.
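
For the record, here are the two things I'm planning to try; both are guesses 
on my part, and the connection string and values below are just placeholders:

    # Guess 1: DAL appears to pass anything in driver_args straight
    # through to the driver's connect() call, and pymysql.connect()
    # takes a connect_timeout keyword -- though that may only cover the
    # initial handshake, not a long-running SELECT:
    db = DAL('mysql://user:pass@dbhost/mydb',
             driver_args={'connect_timeout': 600})

    # Guess 2: if it's the server side giving up while writing the big
    # result set back to the client, the MySQL variable to look at is
    # probably net_write_timeout (and maybe max_allowed_packet):
    db.executesql('SET SESSION net_write_timeout = 600')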

> 
> On Friday, March 15, 2013 3:19:11 PM UTC+1, Jonathan Lundell wrote:
> On 15 Mar 2013, at 6:45 AM, Jonathan Lundell <[email protected]> wrote:
>> I'm trying to export a largish (270K records) MySQL table via the grid or 
>> administrative export-csv functions. In both cases, I get an error after a 
>> fairly long wait; I'm assuming it's a connection timeout.
>> 
>> Traceback (most recent call last):
>>   File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/main.py", line 523, in wsgibase
>>     BaseAdapter.close_all_instances('commit')
>>   File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/dal.py", line 447, in close_all_instances
>>     getattr(instance, action)()
>>   File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/dal.py", line 1430, in commit
>>     return self.connection.commit()
>>   File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/contrib/pymysql/connections.py", line 562, in commit
>>     self.errorhandler(None, exc, value)
>>   File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/contrib/pymysql/connections.py", line 184, in defaulterrorhandler
>>     raise errorclass, errorvalue
>> OperationalError: (2013, 'Lost connection to MySQL server during query')
>> 
>> 
>> Should I just increase a timeout value somewhere (and if so, where?), or is 
>> there a better way to do this? FWIW, a gzipped mysqldump of the entire 
>> database is only 22MB, so we're not talking ridiculously large files here.
> 
> FWIW: this is a fairly conventional setup, running RHEL6 on a Rackspace cloud 
> server, with MySQL on a separate virtual machine.
> 


