"Curious", I'm not sure if your issue was resolved, but I wanted to
let you know that I switched to WWW::Mechanize for a simple crawler
that was hitting around 10,000 pages.  The script finished in under an
hour, verifying everything I needed.  So, depending on your test cases
and requirements, this might be a viable solution for you.  The
WWW::Mechanize version of my script was just over eight lines; pretty simple.
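
In case it helps anyone, here's a rough sketch of what that kind of
script looks like. This isn't my exact script; the urls.txt input
file and the title check are just placeholders for whatever you're
verifying:

  require 'rubygems'
  require 'mechanize'

  agent = WWW::Mechanize.new

  # Write results as we go and flush each line, per Chris's advice
  # below, so memory stays flat and a crash doesn't lose the output.
  File.open('results.txt', 'w') do |out|
    File.foreach('urls.txt') do |url|   # hypothetical list of pages to check
      page = agent.get(url.strip)
      out.puts "#{url.strip}\t#{page.title}"
      out.flush
    end
  end

Mechanize doesn't drive a browser process at all, which is why it's
so much faster than IE for this kind of check.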

Adam

On Sep 8, 11:12 am, AR <reed.a...@gmail.com> wrote:
> I've had the same issue while doing a little sanity checking on some
> marketing websites that we control.  Using IE.new_process helped a
> little, but Ruby would always die before the script completed.
>
> On Aug 30, 6:33 pm, Chris McMahon <christopher.mcma...@gmail.com>
> wrote:
>
> > > It looks to me like something is stuck somewhere and I need to
> > > clear it away, but I don't know what. I restarted the computer,
> > > but the problem does not go away.
>
> > > Could anyone tell me what I need to clear away in order to have
> > > smooth internet access?
>
> > I seem to recall that Ruby under some circumstances will hog memory
> > until Bad Things Happen.
>
> > If you're crawling all these pages and saving bits of information, be
> > sure you're writing out to a file along the way and flushing the write
> > buffer when you do it.
>
> > -Chris