PostgreSQL definitely scales, even with write-intensive workloads, without blocking. Homemade task queues are fun to code but get messy quickly: blocking operations, tasks that fail and need (or don't need) to be requeued, priorities, timeouts, network splits, and so on. I don't think I'm mistaken in saying that Celery is the most widely used library for this kind of work; it's written in Python and built on RabbitMQ (it supports other brokers too, but primarily RabbitMQ) to handle queues. It seems huge, but it's fairly easy to set up (especially if you planned to use RabbitMQ anyway): a few statements are enough for very simple tasks, yet it can be fine-tuned extensively for most requirements out there. There's a web2py plugin for Celery around, but it's designed to have web2py handle task generation (using the db to store queues so they're accessible from web2py), so if you're generating tasks outside web2py it may not be worth looking into. Still, it's nice to have around for managing Celery (worker statuses and so on) from within web2py.
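To see why a homemade queue gets messy, here's a minimal sketch (plain Python stdlib; the names, retry policy, and deadline handling are all my own invention) of a worker loop that already has to deal with priorities, retries, and timeouts by hand, and still ignores persistence and network splits:

```python
import queue
import time

MAX_RETRIES = 3  # hypothetical policy: give up on a task after 3 attempts

# PriorityQueue orders by the first tuple element: lower number = higher priority
tasks = queue.PriorityQueue()

def submit(priority, func, *args):
    # enqueue time breaks ties between equal priorities; last field is retry count
    tasks.put((priority, time.monotonic(), func, args, 0))

def run_once(deadline=None):
    """Drain the queue, requeueing failed tasks up to MAX_RETRIES."""
    done, failed = [], []
    while not tasks.empty():
        priority, ts, func, args, retries = tasks.get()
        if deadline is not None and time.monotonic() > deadline:
            break  # crude timeout handling: abandon whatever is left in the queue
        try:
            done.append(func(*args))
        except Exception:
            if retries + 1 < MAX_RETRIES:
                tasks.put((priority, ts, func, args, retries + 1))  # requeue
            else:
                failed.append((func, args))  # permanently failed
    return done, failed
```

Even this toy version has to make policy decisions (requeue or drop? keep the original priority?), and everything here is lost on a crash; that's the point where a broker-backed library starts paying for itself.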
On Tuesday, May 15, 2012 at 9:01:54 PM UTC+2, cyan wrote:
>
>> I strongly advise using the scheduler, because your requirements will be fulfilled better by it than by a homemade task queue, at least if you're not going to use Celery. Anyway, just my 2 cents:
>> SQLite write operations lock the entire database. One of two controllers (or modules, or scripts, or anything) trying to write to a single database will likely have to wait for the other to finish.
>> 1. With every other database that is not SQLite, it's safe, because of how most relational databases work: transactions always give you a consistent set of results (if you use them correctly!).
>
> Thanks for all the info! I use PostgreSQL as the db; does that make any difference in terms of robustness and performance in this case? Are you suggesting that a beefier db like PostgreSQL *is* going to handle concurrent reads and writes comfortably?
>
>> 2. You can, but installing RabbitMQ just to manage a queue seems overkill. At that point, use Celery directly.
>
> I chose RabbitMQ because the messages containing the updates to the db come from a separate server (Tornado), so the update tasks originate remotely, *not* on the web2py server. I am not familiar with Celery; does it do the same thing better here?
>

