Hi all, I'm trying to put a web interface on a crawler built on Mechanize. I've set up the model with Page and Link classes, and I'm now adding a Crawl class that should hold the crawler logic. Basically, a Crawl would be a collection of the pages that Mechanize fetches. What I want is to start a background process that gets a page and its links, which are then followed recursively up to a user-defined depth. What is the best way to handle this request: run_later in the controller, run_later in the model, or .defer_to in the router? What are the differences? And most importantly, where does the crawler logic go: model, controller, or helper?
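To make the question concrete, here is a minimal sketch of the depth-limited recursion I have in mind for the Crawl model (the idea being that the controller would just kick it off in the background). The fetcher is injected rather than being Mechanize directly, so the class and method names here are hypothetical, not Merb or Mechanize API:

```ruby
# Hypothetical sketch of the crawl logic that would live in the Crawl model.
# The fetcher is injected (in the real app it would wrap Mechanize and
# create Page/Link records) so the traversal itself stays easy to test.
class Crawl
  # fetcher: any object responding to #links_for(url) => Array of URLs
  # max_depth: user-defined recursion limit
  def initialize(fetcher, max_depth)
    @fetcher   = fetcher
    @max_depth = max_depth
    @visited   = {}
  end

  # Recursively follows links up to max_depth, skipping already-visited
  # URLs, and returns the list of URLs that were fetched.
  def run(url, depth = 0)
    return [] if depth > @max_depth || @visited[url]
    @visited[url] = true
    pages = [url]
    @fetcher.links_for(url).each do |link|
      pages.concat(run(link, depth + 1))
    end
    pages
  end
end
```

The point of injecting the fetcher is that the recursion and the depth cutoff can be exercised with a stubbed link graph, independently of where the background dispatch (run_later vs. defer_to) ends up.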
Thanks for any reply,
Giovanni
