Is there anyone out there who has done this, or who might have some
suggestions?

On a few of our sites, I have noticed that search engine crawlers have been
generating errors on pages that require cookies and certain URL variables.

I have taken care of the URL variables. This is an inherited website, so I
might need to look at recoding it if that proves warranted.

The problem is that the logs show a lot of hits from these bots on pages
that reject any request arriving without cookies. At the moment those
requests are served a page telling the user to enable cookies and explaining
how to do so, which a crawler obviously can't act on.
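
One thing I am tempted to try is sniffing the user agent in Application.cfm
and letting the known crawlers skip the cookie check. A rough sketch of what
I mean (the bot list, the cookie test and the redirect target here are just
placeholders, not what the app actually uses):

<!--- Application.cfm: let known crawlers through without the cookie check --->
<cfset isBot = REFindNoCase("googlebot|slurp|msnbot", CGI.HTTP_USER_AGENT) GT 0>

<!--- Normal visitors without cookies still get the "enable cookies" page --->
<cfif NOT isBot AND NOT IsDefined("Cookie.CFID")>
	<cflocation url="enablecookies.cfm" addtoken="no">
</cfif>

Since the crawlers would still see the same page content as everyone else,
this shouldn't amount to cloaking; it just keeps those requests from
erroring out.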

My question is: what are you doing, or what have you done, about something
like this? Any suggestions on the right way to go about it would be welcome.
And how do these search engines cope with pages that tell users they need to
switch cookies on?
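
The only other approach I can think of is keeping the bots off those pages
entirely with robots.txt, along these lines (the paths are made up for the
example):

User-agent: *
Disallow: /members/
Disallow: /checkout/

...though that means those pages never get indexed at all, which may or may
not matter for these sites.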

 
Regards
Andrew Scott
Technical Consultant

NuSphere Pty Ltd
Level 2/33 Bank Street
South Melbourne, Victoria, 3205

Phone: 03 9686 0485  -  Fax: 03 9699 7976


