You're right. In the case where the account name is encoded into the URL and Google manages to index it somehow (i.e., no password is required), we'll definitely see problems.

Maybe we should just add BookmarkablePageRequestTargetUrlCodingStrategy.excludeArgument(String queryParameter) so you could, for example, invoke BookmarkablePageRequestTargetUrlCodingStrategy.excludeArgument("userName") and all the other arguments would still be encoded into the path as before.
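To make the idea concrete, here is a minimal sketch of what such an exclusion rule could do when building a URL: every parameter goes into the path except the excluded ones, which end up in the query string. Note that excludeArgument() is only a proposal, and the class and method names below are hypothetical, not part of Wicket's actual API:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class HybridUrlSketch {

    /**
     * Builds a URL for the given mount path, encoding every parameter into
     * the path except those named in 'excluded', which go into the query
     * string instead (the behaviour excludeArgument() would configure).
     */
    public static String encode(String mountPath,
                                Map<String, String> parameters,
                                Set<String> excluded) {
        StringBuilder path = new StringBuilder(mountPath);
        StringBuilder query = new StringBuilder();
        for (Map.Entry<String, String> e : parameters.entrySet()) {
            if (excluded.contains(e.getKey())) {
                // excluded parameters become ?name=value query arguments
                query.append(query.length() == 0 ? "?" : "&")
                     .append(e.getKey()).append('=').append(e.getValue());
            } else {
                // everything else stays in the path as /name/value pairs
                path.append('/').append(e.getKey())
                    .append('/').append(e.getValue());
            }
        }
        return path.append(query).toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("id", "123");
        params.put("userName", "John");
        // userName is excluded from the path, so it becomes a query parameter
        System.out.println(encode("/showRecord", params, Set.of("userName")));
        // prints "/showRecord/id/123?userName=John"
    }
}
```

With an empty exclusion set the same call would produce the all-path form, /showRecord/id/123/userName/John, so existing mounts would be unaffected.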

Gili

John Patterson wrote:
Such pages MUST have different URLs but perhaps 90% of the content may be the same. Here is an example:

 /showRecord/id/123/userName/Gili
 /showRecord/id/123/userName/John

Every parameter is used to change the displayed page. These pages would show mostly the same content, differing only in the user name displayed. Google would see them as duplicate pages, yet they *cannot* share the same URL.

If one used the form:

/showRecord?id=123&userName=John

Or even better:

/showRecord/id/123?userName=John

Then Google will recognise that the page is dynamic and will not penalise your site. The major search engines no longer have a problem indexing such pages. I now use this second hybrid form and traffic has returned to normal. From reading SEO newsgroups I know that others have had the same problem with duplicate-content penalties after Google's "Jagger" update.
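On the decoding side, the hybrid form carries exactly the same information as the all-path form. The following sketch (my own illustration, not Wicket's actual implementation) parses both URL styles into the same parameter map:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class HybridUrlDecoder {

    /**
     * Parses a URL such as "/showRecord/id/123?userName=John" into a
     * parameter map: path segments after the mount path are read as
     * alternating name/value pairs, then query parameters are merged in.
     */
    public static Map<String, String> decode(String mountPath, String url) {
        Map<String, String> params = new LinkedHashMap<>();
        String rest = url.substring(mountPath.length());
        String pathPart = rest;
        String queryPart = null;
        int q = rest.indexOf('?');
        if (q >= 0) {
            pathPart = rest.substring(0, q);
            queryPart = rest.substring(q + 1);
        }
        // path segments: "", "id", "123", ... -> name/value pairs
        String[] segments = pathPart.split("/");
        for (int i = 1; i + 1 < segments.length; i += 2) {
            params.put(segments[i], segments[i + 1]);
        }
        // query string: name=value pairs joined by '&'
        if (queryPart != null && !queryPart.isEmpty()) {
            for (String pair : queryPart.split("&")) {
                int eq = pair.indexOf('=');
                params.put(pair.substring(0, eq), pair.substring(eq + 1));
            }
        }
        return params;
    }
}
```

Both /showRecord/id/123/userName/John and /showRecord/id/123?userName=John decode to {id=123, userName=John}; only what the search engine sees in the URL differs.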

Hope this helps clarify the problem with passing parameters in the path.


_______________________________________________
Wicket-user mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/wicket-user
