For large jobs, I've tended to put the actual job payload in storage and put
just the ID of that storage bin in beanstalk. Works well :)

On Wed, Aug 26, 2009 at 6:39 PM, Keith Rarick <[email protected]> wrote:

>
> On Tue, Aug 25, 2009 at 10:00 AM, Ian Eyberg<[email protected]> wrote:
> > I have just delved into this recently but I noticed that when I changed
> my
> > max job size from the default of 65535 to 2097152 (around 2 meg) my
> performance
> > hit the toilet -- what are some max job sizes that you guys use?
>
> I always make small jobs (around 100 bytes), but there is no reason,
> in principle, not to make big jobs.
>
> > Maybe someone can tell me whether or not beanstalk should even be
> considered
> > when it comes to larger job sizes of 2meg or so?
>
> I simply haven't done any performance testing of large jobs. I'm sure
> there are improvements to be made. Here's a ticket:
>
> http://github.com/kr/beanstalkd/issues/#issue/18
>
> I'll try to get to it soon after releasing 1.4.
>
> kr
>


-- 
I died in my dreams, what's that supposed to mean

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"beanstalk-talk" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/beanstalk-talk?hl=en
-~----------~----~----~----~------~----~------~--~---