On Apr 11, 2006, at 6:40 PM, Nathan Hamblen wrote:
> It looks like Wicket is doing all the right things. I don't think it
> should try too hard to avoid generating a session for bookmarkable
> links, because the googlebot could always find those same links after
> it has created a session elsewhere, muddying things just as much.
Not really - the correct action would be to use a robots.txt file to
prevent the spider from reaching pages that require a session in the
first place.
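
For example, a robots.txt along these lines keeps well-behaved crawlers
out of the session-bound areas (the paths are hypothetical; a real site
would list its own stateful entry points):

    User-agent: *
    Disallow: /app/cart
    Disallow: /app/checkout
    Disallow: /app/account

Bookmarkable, stateless pages stay crawlable, and anything that
genuinely needs a session is never visited by the spider at all.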
> Instead, search-conscious sites (and what big site isn't?) should just
> disable URL rewriting in a WebRequest subclass. Users without cookies,
> and Google, will still be able to browse the non-interactive parts of
> the site through bookmarkable URLs. You'd have to hack essential
> (search) forms, but that's no harder than doing the same form in a
> caveman framework. Then you'd customize the session-expired page to
> explain that cookies are required for interactivity.
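
For reference, that workaround might look roughly like this in the
Wicket of the day. It's only a sketch: the hook shown is on the
response side (WebApplication.newWebResponse / WebResponse.encodeURL)
rather than a WebRequest subclass, the signatures are from memory of
the 1.x line, and SearchFriendlyApplication, HomePage and
CookiesRequiredPage are made-up names:

    import javax.servlet.http.HttpServletResponse;
    import wicket.protocol.http.WebApplication;
    import wicket.protocol.http.WebResponse;

    public class SearchFriendlyApplication extends WebApplication
    {
        public Class getHomePage()
        {
            return HomePage.class; // hypothetical home page
        }

        protected void init()
        {
            // Send expired-session errors to a page explaining that
            // cookies are required for the interactive parts of the
            // site. (CookiesRequiredPage is a hypothetical page class.)
            getApplicationSettings().setPageExpiredErrorPage(
                CookiesRequiredPage.class);
        }

        // Hand back a response that skips jsessionid rewriting, so
        // bookmarkable links stay clean for crawlers and cookie-less
        // users alike.
        protected WebResponse newWebResponse(
            final HttpServletResponse servletResponse)
        {
            return new WebResponse(servletResponse)
            {
                public CharSequence encodeURL(final CharSequence url)
                {
                    return url; // no ;jsessionid=... appended
                }
            };
        }
    }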
Okay, but I don't think you should require cookies just because you
want to be search-engine friendly. URL rewriting is a great and
useful feature. I don't see why we should give it up if we don't
have to.
The question is really whether it is *possible* for Wicket to defer
creating a session. If it is possible, then let's defer it. If not,
then we have no choice but to use workarounds like this. But let's
be realistic - these *are* workarounds.
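
At the servlet level the deferral itself is trivial, which is what
makes the question worth asking: the container only creates a session
when something calls getSession() or getSession(true). A sketch of the
distinction, using nothing but the standard Servlet API:

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpSession;

    public final class SessionDeferral
    {
        // Look up the session without creating one. A framework that
        // routes stateless (bookmarkable) requests through a lookup
        // like this never hands the googlebot a jsessionid at all.
        static HttpSession existingSession(HttpServletRequest request)
        {
            return request.getSession(false); // false = don't create
        }
    }

The hard part is internal: the framework has to avoid touching its
session object anywhere on the stateless path, since any such touch
forces creation.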