On Wed, Jun 11, 2008 at 5:26 AM, Neil Mock <[EMAIL PROTECTED]> wrote:
> Forgive me if this has been addressed somewhere, but I have searched and
> can't come up with anything.
>
> I am basically trying to distribute several web page scraping tasks among
> different threads, and have the results from each added to an Array which
> is ultimately returned by the backgroundrb worker. Here is an example of
> what I'm trying to do in a worker method:
>
> pages = Array.new
>
> pages_to_scrape.each do |url|
>   thread_pool.defer(url) do |url|
>     begin
>       # model object performs the scraping
>       page = ScrapedPage.new(url)
>       pages << page
>     rescue
>       logger.info "page scrape failed"
>     end
>   end
> end
>
> return pages
>
> From monitoring the backgroundrb logs, it appears that all of the pages
> are completed successfully in the threads. However, the array that is
> returned is empty. This is to be expected, I suppose, because the threads
> don't complete before the array is returned, but my question is: how can
> I make the worker wait to return the array only when all of the threads
> are complete?
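[Editor's note: a minimal plain-Ruby sketch of the pattern the poster is after — spawn the scraping threads, keep a handle to each, and join them all before returning. backgroundrb's thread_pool.defer does not hand back a Thread object, so this uses bare Thread instances instead; ScrapedPage here is a stub standing in for the real model, and the Mutex guards the shared Array, which is not safe to append to concurrently.]

```ruby
require 'thread'

# Stub for the poster's model object; assume the real ScrapedPage
# performs the scrape in its constructor.
ScrapedPage = Struct.new(:url)

def scrape_all(pages_to_scrape)
  pages = []
  lock  = Mutex.new

  # Keep a handle to every thread so we can wait for all of them.
  threads = pages_to_scrape.map do |url|
    Thread.new(url) do |u|
      begin
        page = ScrapedPage.new(u)
        lock.synchronize { pages << page } # Array#<< is not thread-safe
      rescue => e
        warn "page scrape failed: #{e.message}"
      end
    end
  end

  threads.each(&:join) # block here until every thread has finished
  pages
end
```

The key difference from the poster's version is that `return pages` only runs after `join` has been called on every thread, so the array is fully populated by the time it is returned.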
Neil, I have a solution for you in git version:
http://gnufied.org/2008/06/12/unthreaded-threads-of-hobbiton/

_______________________________________________
Backgroundrb-devel mailing list
[email protected]
http://rubyforge.org/mailman/listinfo/backgroundrb-devel
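[Editor's note: for readers who cannot join the pool's threads directly, another standard Ruby idiom is to use a Queue as a completion latch — each job pushes its result (or nil on failure) and the caller pops exactly N times, blocking until all N jobs report in. This is a hedged sketch, not backgroundrb's API; the string result is a stand-in for a real ScrapedPage.]

```ruby
require 'thread'

# Block until all jobs have pushed a result onto the queue.
# Queue#pop blocks when the queue is empty, so popping urls.size
# times waits for every job without needing thread handles.
def scrape_with_queue(urls)
  results = Queue.new
  urls.each do |url|
    Thread.new(url) do |u|
      begin
        results << "scraped:#{u}" # stand-in for ScrapedPage.new(u)
      rescue
        results << nil            # still push, so the count stays right
      end
    end
  end
  urls.size.times.map { results.pop }.compact
end
```

Pushing a sentinel (`nil`) on failure matters: if a failed job pushed nothing, the final `pop` would block forever.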
