I'm trying to export a largish MySQL table (270K records) via the grid or the 
administrative export-CSV function. In both cases I get an error after a 
fairly long wait; I assume it's a connection timeout.

Traceback (most recent call last):
  File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/main.py", line 523, in wsgibase
    BaseAdapter.close_all_instances('commit')
  File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/dal.py", line 447, in close_all_instances
    getattr(instance, action)()
  File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/dal.py", line 1430, in commit
    return self.connection.commit()
  File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/contrib/pymysql/connections.py", line 562, in commit
    self.errorhandler(None, exc, value)
  File "/home/wupadmin/.virtualenvs/watchup/watchup/web2py/gluon/contrib/pymysql/connections.py", line 184, in defaulterrorhandler
    raise errorclass, errorvalue
OperationalError: (2013, 'Lost connection to MySQL server during query')
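For context, MySQL error 2013 during a long read usually means the server dropped the connection, not web2py itself. Before changing anything on the web2py side, it may be worth inspecting the server-side timeouts from the mysql client (these are standard MySQL server variables; the 600-second value below is just an illustration, not a recommendation):

```sql
-- Standard MySQL server variables that govern when the server
-- drops a connection mid-query; inspect the current values first:
SHOW VARIABLES LIKE '%timeout%';

-- If net_write_timeout (time allowed for writing a result set out
-- to the client) turns out to be the culprit, it can be raised for
-- the current session without touching the server config:
SET SESSION net_write_timeout = 600;  -- seconds
```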


Should I just increase a timeout value somewhere (and if so, where?), or is 
there a better way to do this? FWIW, a gzipped mysqldump of the entire database 
is only 22MB, so we're not talking ridiculously large files here.
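One alternative to raising timeouts is to avoid a single long-running query altogether and page through the table in fixed-size batches, appending each batch to the CSV as you go (web2py's DAL can do the paging via the `limitby` argument to `select`). Here is a minimal sketch of the batching logic, with the database fetch abstracted into a plain callable so nothing depends on a live connection; the table and field names would come from your own model:

```python
import csv
import io

def export_in_batches(fetch_batch, out, batch_size=10000):
    """Stream rows to `out` as CSV, `batch_size` rows per query.

    `fetch_batch(offset, limit)` should return a list of row tuples;
    with web2py's DAL it might wrap something like
    db(db.mytable).select(limitby=(offset, offset + limit)).
    Returns the total number of rows written.
    """
    writer = csv.writer(out)
    total = 0
    while True:
        rows = fetch_batch(total, batch_size)
        if not rows:
            break
        writer.writerows(rows)
        total += len(rows)
    return total

# Demo with an in-memory list standing in for the table:
data = [(i, 'row-%d' % i) for i in range(25)]
buf = io.StringIO()
written = export_in_batches(lambda off, lim: data[off:off + lim],
                            buf, batch_size=10)
```

Because each individual query only touches `batch_size` rows, no single select runs long enough to trip the server's timeout, at the cost of a few extra round trips.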

--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.

