Thanks, it's working. That is simpler than I thought, but I hadn't fully
understood the new request engine.

I updated the wiki.

By the way, redirection is another important point for SEO optimization
with Wicket 1.5. Wicket should never perform a redirect when a robot asks
for a page, because the robot can't follow the redirect.

I resolved it with a SEORequestMapper that delegates to the systemMapper
for all requests, except when the user-agent is a bot; in that case I
suppress the redirect.

Do you think it's a good solution?

import org.apache.wicket.Application;
import org.apache.wicket.request.IRequestHandler;
import org.apache.wicket.request.IRequestMapper;
import org.apache.wicket.request.Request;
import org.apache.wicket.request.component.IRequestablePage;
import org.apache.wicket.request.handler.IPageClassRequestHandler;
import org.apache.wicket.request.handler.PageProvider;
import org.apache.wicket.request.handler.RenderPageRequestHandler;
import org.apache.wicket.request.mapper.parameter.PageParameters;

public class SEORequestMapper implements IRequestMapper {

    private final IRequestMapper delegate;

    public SEORequestMapper(IRequestMapper delegate) {
        this.delegate = delegate;
    }

    public IRequestHandler mapRequest(Request request) {
        IRequestHandler requestHandler = delegate.mapRequest(request);
        // BotUtil is my own helper that detects crawler user-agents.
        BotUtil botUtil = new BotUtil();
        if (botUtil.isBotAgent() && requestHandler instanceof IPageClassRequestHandler) {
            IPageClassRequestHandler handler = (IPageClassRequestHandler) requestHandler;
            return renderPageWithoutRedirect(handler.getPageClass(),
                handler.getPageParameters());
        }
        return requestHandler;
    }

    private IRequestHandler renderPageWithoutRedirect(
            Class<? extends IRequestablePage> pageClass, PageParameters pageParameters) {
        PageProvider pageProvider = new PageProvider(pageClass, pageParameters);
        pageProvider.setPageSource(Application.get().getMapperContext());
        // Render the page directly, never via a redirect, so crawlers get the
        // content at the URL they requested.
        return new RenderPageRequestHandler(pageProvider,
            RenderPageRequestHandler.RedirectPolicy.NEVER_REDIRECT);
    }

    // ... delegation of the other IRequestMapper methods to the delegate.
}
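
For completeness, here is a minimal sketch of how such a mapper could be
plugged in during application init, assuming Wicket 1.5's
getRootRequestMapper()/setRootRequestMapper() API (MyApplication and
HomePage are placeholder names, not my real classes):

import org.apache.wicket.Page;
import org.apache.wicket.protocol.http.WebApplication;

public class MyApplication extends WebApplication {

    @Override
    public Class<? extends Page> getHomePage() {
        return HomePage.class; // placeholder home page
    }

    @Override
    protected void init() {
        super.init();
        // Wrap the root mapper so every incoming request goes through the
        // bot-aware SEORequestMapper before normal mapping takes place.
        setRootRequestMapper(new SEORequestMapper(getRootRequestMapper()));
    }
}

The remaining IRequestMapper methods (getCompatibilityScore and mapHandler)
simply forward to the wrapped delegate.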

2011/10/11 Martin Grigorov <mgrigo...@apache.org>

> protected WebResponse newWebResponse(final WebRequest webRequest,
>         final HttpServletResponse httpServletResponse)
> {
>     return new ServletWebResponse((ServletWebRequest) webRequest,
>         httpServletResponse)
>     {
>         @Override
>         public String encodeURL(CharSequence url)
>         {
>             final String agent = webRequest.getHeader("User-Agent");
>             // isAgent(): whether the User-Agent belongs to a crawler.
>             return isAgent(agent) ? url.toString() : super.encodeURL(url);
>         }
>     };
> }
>
> Please update the wiki page if that works.
> Thanks!
> On Mon, Oct 10, 2011 at 11:34 PM, Gaetan Zoritchak
> <g.zoritc...@moncoachfinance.com> wrote:
> > Hi,
> >
> > Removing the jsessionid for crawling robots is important to avoid the
> > duplicate content problem.
> >
> > The solution proposed in the wiki,
> > https://cwiki.apache.org/WICKET/seo-search-engine-optimization.html, no
> > longer works. Does anyone know the new way of doing it in Wicket 1.5?
> >
> > Thanks in advance,
> >
> > Gaetan,
> >
>
>
>
> --
> Martin Grigorov
> jWeekend
> Training, Consulting, Development
> http://jWeekend.com
>
