On Mon, Sep 06 2010, Rafal Krypa wrote:
When building multiple jobs, the rebuildd daemon dies occasionally. The problem does not happen when max_jobs = 1. Below is the traceback appearing in rebuildd.log:

Traceback (most recent call last):
  File "/usr/sbin/rebuildd", line 50, in <module>
    Rebuildd().daemon()
  File "/usr/lib/pymodules/python2.6/rebuildd/Rebuildd.py", line 78, in daemon
    self.loop()
  File "/usr/lib/pymodules/python2.6/rebuildd/Rebuildd.py", line 369, in loop
    self.get_new_jobs()
  File "/usr/lib/pymodules/python2.6/rebuildd/Rebuildd.py", line 163, in get_new_jobs
    job.status = JobStatus.WAIT_LOCKED
  File "/usr/lib/pymodules/python2.6/rebuildd/Job.py", line 64, in __setattr__
    self.status_changed = sqlobject.DateTimeCol.now()
  File "/usr/lib/pymodules/python2.6/rebuildd/Job.py", line 65, in __setattr__
    sqlobject.SQLObject.__setattr__(self, name, value)
  File "<string>", line 1, in <lambda>
  File "/usr/lib/python2.6/dist-packages/sqlobject/main.py", line 1050, in _SO_setValue
    dbValue)])
  File "/usr/lib/python2.6/dist-packages/sqlobject/dbconnection.py", line 517, in _SO_update
    self.sqlrepr(so.id)))
  File "/usr/lib/python2.6/dist-packages/sqlobject/dbconnection.py", line 343, in query
    return self._runWithConnection(self._query, s)
  File "/usr/lib/python2.6/dist-packages/sqlobject/dbconnection.py", line 256, in _runWithConnection
    val = meth(conn, *args)
  File "/usr/lib/python2.6/dist-packages/sqlobject/dbconnection.py", line 340, in _query
    self._executeRetry(conn, conn.cursor(), s)
  File "/usr/lib/python2.6/dist-packages/sqlobject/sqlite/sqliteconnection.py", line 183, in _executeRetry
    raise OperationalError(ErrorMessage(e))
sqlobject.dberrors.OperationalError: database is locked

Hum, SQLite… Well, I don't know if that's easily fixable, but I'd suggest using another database backend if you really want concurrent access to your database. SQLite just sucks at that.
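
For illustration, here is a minimal sketch of what switching the backend looks like at the SQLObject level (the paths, credentials and database name are made up, and rebuildd's actual configuration may expose this differently, so treat it only as an assumption about the general idea):

    from sqlobject import connectionForURI, sqlhub

    # SQLite allows only one writer at a time, so several build jobs
    # updating Job rows concurrently can trip over each other and raise
    # "OperationalError: database is locked" (illustrative path):
    #sqlhub.processConnection = connectionForURI(
    #    'sqlite:///var/lib/rebuildd/rebuildd.db')

    # A server backend such as PostgreSQL handles concurrent writers
    # with row-level locking; requires the psycopg2 driver
    # (hypothetical credentials and database name):
    sqlhub.processConnection = connectionForURI(
        'postgres://rebuildd:secret@localhost/rebuildd')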

--
Julien Danjou
// ᐰ <jul...@danjou.info>   http://julien.danjou.info

