I just discovered that some code I recently implemented is preventing bots (the good kind) from crawling our site. The reason is that my new code relies on cookies, and bots do not support cookies.

I have a language switcher on the site. Since a cookie does not take effect until the next page load, I redirect immediately after setting it so that the user sees the effect of switching languages right away. To avoid setting the cookie on every page load, I first check whether the cookie already exists before setting it.

The problem is that browsers (and bots) that do not support cookies, or have them disabled, will never satisfy the "cookie is set" check, so they get stuck in an infinite redirect loop. I have spent hours trying to come up with a solution and am still at a loss. Does anyone have any ideas how I can make this work?

This is the code that sets the cookie and does the redirect:

http://rafb.net/p/wsNBaW27.html
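For reference, here is a minimal sketch of the pattern described above, along with one common way to break the loop: tag the redirect URL with a marker query parameter, so that a client arriving with the marker but still no cookie is known not to support cookies and is served the page without redirecting again. All names here (`switch_language`, the `lang` cookie, the `cookiecheck` parameter) are my own illustrations, not the code in the paste.

```php
<?php
// Sketch only: function, cookie, and parameter names are assumptions,
// not the original implementation.
function switch_language($requested_lang, $default_lang = 'en')
{
    // Cookie already set: no redirect needed.
    if (isset($_COOKIE['lang'])) {
        return $_COOKIE['lang'];
    }

    // We already redirected once (marker present) and the cookie still
    // did not come back, so cookies are unsupported or disabled. Serve
    // the page anyway; bots and cookieless browsers never loop.
    if (isset($_GET['cookiecheck'])) {
        return $requested_lang ?: $default_lang;
    }

    // First visit: set the cookie, then redirect exactly once, tagging
    // the target URL so the next request can detect cookie support.
    setcookie('lang', $requested_lang, time() + 86400 * 365, '/');
    $sep = (strpos($_SERVER['REQUEST_URI'], '?') === false) ? '?' : '&';
    header('Location: ' . $_SERVER['REQUEST_URI'] . $sep . 'cookiecheck=1');
    exit;
}
```

The key design point is that the redirect happens at most once per visit: the marker parameter carries the "I already tried" state in the URL itself, which works even for clients that discard cookies entirely.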


_______________________________________________

UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net
