>>> Mihai Preda <[EMAIL PROTECTED]> 22-Aug-00 1:05:14 PM >>>
>But, if I want to get 10 pages from a server, wouldn't it
>be more advantageous (for the server too) to get the 10 pages
>at once (with a persistent connection) and afterwards leave it
>in peace for 300sec (30*10), rather than open a TCP connection
>and get a page every 30sec, 10 times?
Well... there is still the question of processor and disk I/O
utilization, but...
>First, I'd like to know what is the consensus in this matter.
I think persistent connections are all right for robots.
Trouble is, you can't guarantee you're going to get one, can you?
>Second, I'd like to know what you think about this proposition,
>which is aimed at allowing the use of persistent connections by
>robots: So, we are against server overload, and we allocate
>30sec/page. But we allow getting a limited number of pages (say,
>5 or 10) together through a persistent connection, with the
>condition of letting the server rest longer afterwards. If we get
>5 pages at once, we won't ask anything from the same server for
>the next 30*5 seconds. What do you think?
I don't think that's necessary myself... if you get a persistent
connection you should be able to get 5-10 pages and then drop for 30
seconds.
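For what it's worth, the rule quoted above could be sketched in a few
lines of Python. This is only an illustration, not anyone's actual
robot code: the names (PoliteScheduler, DELAY, MAX_BATCH) and the
per-host bookkeeping are my own assumptions. The idea is just that a
host which serves k pages over one persistent connection is owed
k * 30 seconds of rest, so the average load stays at one page per
30 seconds either way.

```python
import time

DELAY = 30     # seconds of "rest" owed per page fetched (assumed)
MAX_BATCH = 5  # cap on pages taken in one persistent connection (assumed)

class PoliteScheduler:
    """Hypothetical per-host politeness bookkeeping for a robot."""

    def __init__(self):
        # host -> earliest time (seconds) at which we may reconnect
        self.next_allowed = {}

    def may_connect(self, host, now=None):
        """True if the host's rest period has elapsed."""
        now = time.time() if now is None else now
        return now >= self.next_allowed.get(host, 0.0)

    def record_batch(self, host, pages_fetched, now=None):
        """Charge the host's rest period: k pages -> k * DELAY seconds."""
        now = time.time() if now is None else now
        k = min(pages_fetched, MAX_BATCH)
        self.next_allowed[host] = now + k * DELAY
```

So a robot would check may_connect() before opening a connection,
fetch up to MAX_BATCH pages while the connection stays open, and then
call record_batch() to back off proportionally.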
Still... it may be worth doing... it would take a long time for rule
changes to percolate through to the real net (as it were).
Nic