> I strongly advise using the scheduler, because it will meet your 
> requirements better than a homemade task queue, at least if you are 
> not going to use Celery. Anyway, just my 2 cents:
> SQLite write operations lock the entire database. If two controllers 
> (or modules, or scripts, or anything else) try to write to a single 
> database, one will likely have to wait for the other to finish.
> 1. With any database other than SQLite this is safe, because of how 
> most relational databases work: transactions always give you a 
> consistent set of results (if you use them correctly!).
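The write-lock behaviour described above is easy to reproduce with Python's built-in sqlite3 module. A minimal sketch (the file path and short timeout are just for the demo): one connection takes the write lock, and a second writer gets "database is locked" instead of waiting indefinitely.

```python
import os
import sqlite3
import tempfile

# Two writers against one SQLite file. isolation_level=None puts the
# connections in autocommit mode so we can issue BEGIN/COMMIT ourselves;
# the short timeout makes the lock conflict surface quickly.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer_a = sqlite3.connect(path, timeout=0.1, isolation_level=None)
writer_a.execute("CREATE TABLE t (x INTEGER)")
writer_a.execute("BEGIN IMMEDIATE")            # takes the write lock
writer_a.execute("INSERT INTO t VALUES (1)")

writer_b = sqlite3.connect(path, timeout=0.1, isolation_level=None)
try:
    writer_b.execute("BEGIN IMMEDIATE")        # must wait for writer_a
    blocked = False
except sqlite3.OperationalError:
    blocked = True                             # "database is locked"

writer_a.execute("COMMIT")                     # lock released
writer_b.execute("BEGIN IMMEDIATE")            # now succeeds
writer_b.execute("INSERT INTO t VALUES (2)")
writer_b.execute("COMMIT")

print(blocked)  # True: the second writer had to back off
```

Once the first transaction commits, the second writer proceeds normally, which is exactly the "wait for the other one to finish" behaviour mentioned above.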


Thanks for all the info! I use PostgreSQL as the database; does that make 
any difference in terms of robustness and performance in this case? Are 
you suggesting that a beefier database like PostgreSQL *is* going to 
handle concurrent reads and writes comfortably?

> 2. You can, but installing RabbitMQ just to manage a queue seems like 
> overkill. At that point, use Celery directly. 


I chose RabbitMQ because the messages containing the database updates 
come from a separate server (Tornado), so the update tasks originate 
remotely, *not* on the web2py server. I am not familiar with Celery; 
does it do the same thing better here?
