On Fri, Jul 9, 2010 at 12:42 PM, Christopher Bare
<[email protected]> wrote:

> I'm not really convinced that I want to ship executable task objects
> across a queue. I had pictured sending a blob of parameters as a
> dictionary or JSON object.

You aren't shipping executable task objects across a queue.  You're
passing a message that says to run the task registered under a given
name, along with its parameters.  By default the serialization is done
using pickle, but if you want to use JSON you can do that as well.  See
http://celeryq.org/docs/faq.html#is-celery-dependent-on-pickle
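
For concreteness, roughly like this (an untested sketch in Celery 2.x
style; the task name, parameters, and config values are my own
illustration, not anything from your project):

    # celeryconfig.py -- serialize task messages as JSON instead of pickle
    CELERY_TASK_SERIALIZER = "json"

    # tasks.py
    from celery.task import task

    @task
    def process_chunk(params):
        # 'params' arrives as plain data (a dict), not executable code
        return sum(params["values"]) * params["scale"]

    # caller: only the task name and this dict travel over the broker
    process_chunk.delay({"values": [1, 2, 3], "scale": 10})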

> And, I pictured the workers as being a
> little more purpose specific, for example loading up big chunks of
> data that will be used for all tasks.

You can do this by writing a custom loader and defining an
on_worker_init method on it.  See
http://celeryq.org/docs/reference/celery.loaders.base.html
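
Something along these lines (again an untested sketch; the class name,
data path, and load_reference_data helper are made up for illustration):

    # myproject/loaders.py
    from celery.loaders.default import Loader

    REFERENCE_DATA = None

    def load_reference_data(path):
        # stand-in for whatever expensive one-time load you need
        return open(path).read()

    class PreloadingLoader(Loader):

        def on_worker_init(self):
            # runs once when celeryd starts, before any tasks execute,
            # so every task handled by this worker can reuse the data
            super(PreloadingLoader, self).on_worker_init()
            global REFERENCE_DATA
            REFERENCE_DATA = load_reference_data("/data/reference.dat")

Then point the worker at it with the CELERY_LOADER environment variable
(e.g. CELERY_LOADER=myproject.loaders.PreloadingLoader celeryd), if I
remember the setting right.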

--Mike
