Curious, I'm not sure if your issue was resolved, but I wanted to
let you know that I switched to WWW::Mechanize for a simple crawler
that was hitting around 10,000 pages. The script finished in under an
hour and verified everything I needed. So, depending on your test
cases and requirements, it may be worth a look.
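
For reference, the core of it was roughly this (an untested sketch;
the start URL is a placeholder, and on mechanize 1.0+ the class is
Mechanize rather than WWW::Mechanize):

require 'rubygems'
require 'mechanize'

# Pre-1.0 mechanize gems expose the agent as WWW::Mechanize;
# on 1.0 or later, use Mechanize.new instead.
agent = WWW::Mechanize.new
queue = ['http://example.com/']   # placeholder start URL
seen  = {}

until queue.empty?
  url = queue.shift
  next if seen[url]
  seen[url] = true

  begin
    page = agent.get(url)
  rescue StandardError => e
    puts "FAILED #{url}: #{e.message}"
    next
  end

  puts "OK #{url}"
  next unless page.respond_to?(:links)   # skip non-HTML responses

  # Stay on the same site; adjust the prefix check for your case.
  page.links.each do |link|
    href = link.href.to_s
    queue << href if href.start_with?('http://example.com/')
  end
end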
I've had the same issue while doing a little sanity checking on some
marketing websites that we control. Using IE.new_process helped a
little, but Ruby would always die before the script completed.
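
For what it's worth, the new_process workaround I tried looked
roughly like this, recycling the IE process every so often to cap
memory growth (a sketch; the batch size of 100 and the urls.txt
input file are made up):

require 'rubygems'
require 'watir'

urls = File.readlines('urls.txt').map { |line| line.strip }

browser = Watir::IE.new_process
urls.each_with_index do |url, i|
  # Close and restart IE every 100 pages so leaked memory is reclaimed.
  if i > 0 && i % 100 == 0
    browser.close
    browser = Watir::IE.new_process
  end
  browser.goto(url)
  puts "#{url}: #{browser.title}"
end
browser.close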
On Aug 30, 6:33 pm, Chris McMahon christopher.mcma...@gmail.com wrote:
looks to me that ...
Well, if you're crawling tens of thousands of pages, you might have
picked up a nasty bug somewhere along the way. I would run some virus
scans. Also run some network diagnostics (try pinging some sites,
etc.).
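
If it helps, something quick like this covers the ping part (a
sketch using the ping library from the Ruby 1.8 stdlib; the hosts
and port are arbitrary):

require 'ping'

# Ping.pingecho attempts a TCP connection to the given port.
%w(google.com example.com).each do |host|
  ok = Ping.pingecho(host, 5, 80)   # host, timeout in seconds, port
  puts "#{host}: #{ok ? 'reachable' : 'unreachable'}"
end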
-Dylan
On Aug 29, 8:04 am, curious csamigr...@gmail.com wrote:
I have massive WATIR ...