Hi all,

I'm stuck with a wee search engine problem. I have a site that was once
on domain x, with old-fashioned URLs, and is now on domain y with nice
semantic URLs. The content on the new site is largely different, but
there's a PHP-based lookup of the old URL patterns that will redirect
people to an equivalent page (where one exists) on the new site with a
301 redirect.
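To make the setup concrete, the legacy lookup I have in mind works roughly like this (a minimal sketch: the map entries, URL matching, and domain name are placeholders, not the real code):

```php
<?php
// Hypothetical legacy-URL map; the real lookup uses the old site's
// actual URL patterns and the real new-site paths.
$legacyMap = array(
    '/old.php?page=about' => '/about',
    '/old.php?page=news'  => '/news',
);

$requested = $_SERVER['REQUEST_URI'];

if (isset($legacyMap[$requested])) {
    // An equivalent page exists on the new site: permanent redirect.
    header('Location: http://www.domain-y.example' . $legacyMap[$requested], true, 301);
    exit;
}

// The old URL pattern is recognised as once-valid, but there is no
// equivalent page on the new site: the content is gone for good.
header('HTTP/1.1 410 Gone');
// ...render the friendly "here's how to search for it" page...
```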

I issue a straight 301 using a rewrite rule from domain x to the same
path on domain y to kick things off.
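The blanket redirect can be expressed as a mod_rewrite rule along these lines (the domain names here are placeholders):

```apache
# In the vhost or .htaccess for domain x: send every request to the
# same path on domain y with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain-x\.example$ [NC]
RewriteRule ^(.*)$ http://www.domain-y.example/$1 [R=301,L]
```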

Now the problem I'm having is with pages that don't have a "legacy
redirect" in PHP: they hit the old URL on domain x, get a 301 to the
same URL on domain y, and then receive a 410 because no redirect is
available, even though the lookup recognises the old URL pattern as
having been valid. Humans deal with this well, because the end result is
a nice page explaining how to search for what they want, etc, but search
engines (at least Ask and MSN) have been continuously trying to hit the
old URL on domain x for over a year now, even though it 301s to a 410.

I'm trying to find a solution to make those search engines stop trying
to index the old URLs. I recall that I've seen that mod_rewrite can do a
proxy-check of the destination URL before it redirects, and bail out of
the rule if the client would be redirected to a non-200, but I can't
find this in the redirect documentation at a glance...

I guess another option would be a robots.txt served only on domain x
that disallows the search engines for that domain entirely?
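That per-domain robots.txt could be done with another mod_rewrite rule, something like this sketch (again, the domain and file names are placeholders):

```apache
# Serve a separate robots.txt only when the request arrives on domain x,
# leaving domain y's normal robots.txt untouched.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain-x\.example$ [NC]
RewriteRule ^robots\.txt$ /robots-domain-x.txt [L]
```

where robots-domain-x.txt blocks everything:

```
User-agent: *
Disallow: /
```

One wrinkle: this rule would have to run before the blanket 301, otherwise requests for robots.txt get redirected to domain y along with everything else and the crawlers never see the disallow.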

Any ideas?


Neil

--~--~---------~--~----~------------~-------~--~----~
NZ PHP Users Group: http://groups.google.com/group/nzphpug
To post, send email to [email protected]
To unsubscribe, send email to
[EMAIL PROTECTED]
-~----------~----~----~----~------~----~------~--~---
