Greetings!

I'd like to request this as a new feature, though hopefully I'm mistaken and
it already exists.

Something I have been wondering about as my document count grows: why
doesn't aspseek automatically delete bad URLs?

For example, say I index 10,000 URLs and 1,000 of them are bad (the site no
longer exists).  From what I can see on my server, aspseek goes back and
tries to index those 1,000 URLs on every run.  That isn't such a big deal
now, but what if I had 500,000 URLs and 50,000 were bad?  Making 50,000
attempts to reindex dead URLs every time you run index seems like a waste of
bandwidth, time, and other resources.  Couldn't aspseek keep track of how
many times it has failed to retrieve a URL and, after n tries, simply delete
that URL from its ToDo list?
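
In the meantime, I suppose the bad ones could be pruned by hand in MySQL.
Here is a rough sketch; the "urls" table and "status" column are my guesses
at the schema (I haven't verified how aspseek actually records fetch
results, and a dead site may well be stored under a different code than
404):

    -- count URLs whose last fetch came back 404 Not Found
    -- (table/column names are assumptions, not the verified schema)
    SELECT COUNT(*) FROM urls WHERE status = 404;

    -- drop them so index stops retrying
    DELETE FROM urls WHERE status = 404;

Of course, built-in support that does this after n failed tries would be
much cleaner than hand-deleting rows.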

Thanks in advance.

John

----- Original Message -----
From: "Kir Kolyshkin" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, June 20, 2001 2:48 AM
Subject: Re: [aseek-users] total pages indexed?


> Lee Bolding wrote:
> >
> > does anybody know how I can tell how many pages in total my aspseek has
> > indexed?
> >
> > I know that I can do a "select count(*) from sites" to find the total
number
> > of domains, but is there a similar trick for pages?
> >
> > Thanks...
>
> Look into aspseek's var "total" file.
>
> --  [EMAIL PROTECTED]  http://kir.sever.net   ICQ 7551596  --
> Bend the facts to fit the conclusion. It's easier that way.
> --  |_ | |\| |_| ><   --  |_| |\| | ><   --   | ) |\/|   --
