On 26/8/05 3:55 PM, "Bob Wyman" <[EMAIL PROTECTED]> wrote:

> Remember, PubSub never does
> anything that a desktop client doesn't do.

Periodic re-fetching is a robotic behaviour, common to both desktop
aggregators and server-based aggregators. Robots.txt was established to
minimise harm caused by automatic behaviour, whether by excluding
non-idempotent URLs, avoiding tarpits of endless dynamic links, and so
forth. While it is true that each of those scenarios involves crawling new
links, the base principle at stake is preventing harm caused by automatic or
robotic behaviour. That can include extremely frequent periodic re-fetching,
a scenario which didn't really exist when robots.txt was first put together.
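
For what it's worth, here is a minimal sketch (Python, purely illustrative;
the feed URL, user-agent string, and fallback interval are my own made-up
examples, not anything PubSub or any particular aggregator actually does) of
how a polling aggregator could consult robots.txt, and honour a Crawl-delay
directive, before scheduling its periodic re-fetches:

    # Sketch only: check robots.txt before re-fetching a feed.
    from urllib.robotparser import RobotFileParser
    from urllib.parse import urljoin

    FEED_URL = "http://example.org/feeds/atom.xml"   # hypothetical feed
    USER_AGENT = "ExampleAggregator/1.0"             # hypothetical UA

    rp = RobotFileParser()
    rp.set_url(urljoin(FEED_URL, "/robots.txt"))
    rp.read()

    if rp.can_fetch(USER_AGENT, FEED_URL):
        # Respect a Crawl-delay directive if present; otherwise fall back
        # to a conservative default polling interval (assumed here).
        delay = rp.crawl_delay(USER_AGENT) or 3600
        print(f"OK to fetch {FEED_URL}; wait at least {delay}s between fetches")
    else:
        print(f"robots.txt disallows fetching {FEED_URL}")

The point is simply that the same mechanism sites already use to rein in
crawlers can express limits on any robotic fetching, re-fetching included.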

e.
