On Sat, Apr 12, 2008 at 1:57 AM, Raghu Srinivasan <[EMAIL PROTECTED]> wrote:
>
> On my site, a user enters an RSS feed to be processed and since this takes
> about 5-10 secs, I pass the process off to a background job and meanwhile do
> some Ajaxy spinners and tap-dancing until the job completes and then
> redirect_to the appropriate page. This works great.
Tap-dancing, LOL. I love that.

> Next, is there a way around this? Can I have 2 threads/processes/ports/etc
> for Bdrb so that the batch job doesn't interfere with a live user's
> experience. Or any other workaround for this? Right now if the web user
> comes along when 50 jobs are left and each job takes 10 secs, then he has a
> nearly 10 minute wait, which sucks.

In my opinion you should always separate scheduled long-running processes
from user-spawned ones. What I would do in this case is extract the common
RSS-processing functionality into a class in your Rails lib directory, then
create two different BackgrounDRb workers that make use of that class. One
would be UserRSSWorker and the other could be ScheduledRSSWorker. The first
should be used only for user requests (and in addition you should use
thread_pool.defer to allow multiple requests at once, which might also solve
your original problem with a single worker); the other can be set up on
your schedule.

Also, I am not sure how to do it offhand, but you should try to set up the
ScheduledRSSWorker so that BackgrounDRb instantiates it fresh for each run
and then kills it once it is done, since you don't need it sitting around
all day doing nothing.

But as I said above in parentheses: in general, if you want a worker to be
able to handle many jobs at once, use thread_pool.defer. Be sure to read
the documentation, because this is threaded code and there are things you
need to be careful about.

Ryan
_______________________________________________
Backgroundrb-devel mailing list
[email protected]
http://rubyforge.org/mailman/listinfo/backgroundrb-devel
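A minimal sketch of the split Ryan describes, for the record. The class and
worker names, the `process` method, and the stand-in processing step are all
invented for illustration; the `thread_pool.defer` block form is from the
BackgrounDRb docs as I recall them, so check the API for your version. The
worker definitions are guarded so the file also loads outside a BackgrounDRb
environment.

```ruby
# Shared logic, extracted into RAILS_ROOT/lib so both workers can use it.
class RssProcessor
  def initialize(feed_url)
    @feed_url = feed_url
  end

  # Stand-in for the real 5-10 second fetch-and-parse step.
  def process
    "processed #{@feed_url}"
  end
end

# The two workers are then thin wrappers around RssProcessor.
if defined?(BackgrounDRb::MetaWorker)
  class UserRssWorker < BackgrounDRb::MetaWorker
    set_worker_name :user_rss_worker

    # thread_pool.defer hands the job to a pool thread, so several
    # user-triggered feeds can run concurrently instead of queueing.
    # Keep the block free of shared mutable state -- it runs threaded.
    def process_feed(feed_url)
      thread_pool.defer(feed_url) do |url|
        RssProcessor.new(url).process
      end
    end
  end

  class ScheduledRssWorker < BackgrounDRb::MetaWorker
    set_worker_name :scheduled_rss_worker

    # Invoked from the backgroundrb.yml schedule; the nightly batch
    # never touches the user-facing worker's queue.
    def process_all(feed_urls)
      feed_urls.each { |url| RssProcessor.new(url).process }
    end
  end
end
```

The point of the guard and the extraction is that the concurrency policy
lives in the workers while the slow RSS work lives in one testable class.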
