On Fri, 10 Mar 2000, Marc Slemko wrote:

> Suppose I have a site.  A fairly large and popular site that is some sort
> of message board type site, with several million or so unique and
> legitimate messages.
>
> Suppose the URLs for the messages are all in the form
> http://site/foo/showme.foo?msgid=6666 where 6666 identifies the message.
>
> Suppose I want common robots to index it, since the messages do
> contain useful content. It is to their advantage because it gives
> their engines real content and better results than engines that
> don't index the site, and to my advantage because having it indexed
> brings people to the site.

I am guessing that these generated-on-the-fly pages do not have
changing content. So why not have a programmer write a quick script
to locally fetch every one of these pages and save them as plain old
static HTML files? Your web server will have a lot less work to do,
and all of your pages can be indexed by the major search engines.

If the "msgid" (in your example) has a standard format, this should be
quite trivial. Someone could write a routine to do this in just a few
minutes. (Of course, it may take hours to run and to verify the results.)
Plus you could automate the routine to run weekly to convert new dynamic
webpages to static.
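
Here is a minimal sketch of what I mean, in Python. The URL pattern,
the msgid range, and the output filenames are assumptions on my part,
not details taken from your actual site:

  # Sketch: mirror dynamic message pages to static HTML files.
  # BASE, the id range, and the output layout are hypothetical.
  import os
  import urllib.request
  import urllib.error

  BASE = "http://site/foo/showme.foo?msgid=%d"   # assumed URL pattern
  OUTDIR = "static"                              # assumed output directory

  os.makedirs(OUTDIR, exist_ok=True)
  for msgid in range(1, 10001):                  # assumed range of message ids
      url = BASE % msgid
      try:
          html = urllib.request.urlopen(url).read()
      except urllib.error.HTTPError:
          continue                               # skip ids that don't exist
      with open(os.path.join(OUTDIR, "msg%d.html" % msgid), "wb") as f:
          f.write(html)

Run it from a weekly cron job and point the robots at the static
copies instead of the CGI URLs.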

  Jeremy C. Reed
  http://www.reedmedia.net
  http://bsd.reedmedia.net
