Hi cmcmahon, it looks like you've gone to a lot of trouble, and I won't say it 
was for nothing.  Thank you for trying things out that might help me.  Your 
conclusions appear to be correct as far as your website goes.  It may well be 
that there is a memory leak caused by the way you are doing things.  Performing 
recursion with JavaScript in a webpage can cause the browser to hold on to 
previously used memory, because "it doesn't know any better".  Here is a link 
to a thread that briefly discusses this sort of phenomenon: 
http://www.thescripts.com/forum/thread153454.html
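
On the Ruby side, at least, that kind of unbounded recursive growth can be 
capped explicitly.  This is only a generic sketch of a depth-limited crawl, 
assuming links can be fetched per page; `crawl` and the injected `fetch_links` 
block are hypothetical names, not your code or my Spider's actual code:

```ruby
# Hedged sketch: recursive crawl with an explicit depth cap and a visited
# set, so the recursion (and the memory it holds) cannot grow without bound.
# The link-extraction step is injected as a block to keep the sketch
# self-contained and testable.
def crawl(url, depth, max_depth, visited = {}, &fetch_links)
  # Stop descending once past the depth cap, and never revisit a URL.
  return [] if depth > max_depth || visited[url]
  visited[url] = true
  found = [url]
  fetch_links.call(url).each do |link|
    found.concat(crawl(link, depth + 1, max_depth, visited, &fetch_links))
  end
  found
end
```

For example, with a tiny fake link graph, `crawl("a", 0, 1) { |u| graph.fetch(u, []) }` 
visits "a" and its direct links but goes no deeper.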

The other thing you're doing is considered "unsafe" in the world of 
programming, so it is not normal behavior for a computer.  Most operating 
systems are built to prevent it, but I know you can override that when 
compiling your kernel on Linux, for example, in order to minimize the amount 
of memory used by a single process.  I suppose similar tweaks exist for 
Windows, though I haven't looked into it.  In such a case, a process may be 
restarted in order to try to free up memory.

As for clicking a lot affecting performance, I'm not sure what to think.  It 
seems unlikely that simply clicking would decrease performance, since IE does 
not continuously withhold memory from the system unless the web application 
handles memory poorly, especially when using objects.  For example, when you 
leave a website, the memory IE retained for that website's cache is released, 
and the objects it was saving are referenced through shortcuts in the cache 
space.  This kind of performance problem would have to be caused by other 
programs running, or by some memory leak contained in the web site/web 
application itself.

Thank you again for the information.  I am continuing to work through these 
issues.  Interestingly, another error I see in the same place every time is 
that about half of my level-two links come back as null.  I think I know why, 
but I'm not sure yet.  I have already put precautionary blocks around that 
section of code to keep it from ultimately killing the Spider, but the error 
shows up consistently in the log I create.
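
For what it's worth, the guarding I mean looks roughly like the following: 
skip and log each nil link instead of letting a later call on nil raise and 
kill the crawl.  This is a simplified, self-contained sketch; 
`collect_safe_links` is a hypothetical name, and the real code wraps a Watir 
link collection rather than a plain array:

```ruby
require 'logger'

# Hedged sketch: filter out nil entries from a collected list of links,
# logging each occurrence so it still shows up in the log without aborting
# the Spider.  Returns only the non-nil links.
def collect_safe_links(links, logger = Logger.new($stderr))
  links.each_with_index.filter_map do |link, i|
    if link.nil?
      logger.warn("level-two link ##{i} came back nil; skipping")
      nil  # filter_map drops nil results
    else
      link
    end
  end
end
```

So `collect_safe_links(["a", nil, "b"])` logs one warning and returns 
`["a", "b"]`.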

Nathan
---------------------------------------------------------------------
Posted via Jive Forums
http://forums.openqa.org/thread.jspa?threadID=5183&messageID=14536#14536
_______________________________________________
Wtr-general mailing list
[email protected]
http://rubyforge.org/mailman/listinfo/wtr-general
