On Wed, 2007-08-29 at 13:47 -0600, Rusty Keele wrote:
> So,
>             At Sheri's presentation last week she mentioned that Google
> doesn't do very well with links that pass more than one variable.  For
> example: website.com?val1=98&val2=13  I have also found that when I try
> to validate any XHTML code where I have links like this I always get an
> error.  Can somebody explain this to me?  Is there some XHTML standard
> that prohibits using more than one $_GET variable, or is it just good
> coding practice to avoid multi-variable link references?

In my opinion $_GET variables should only be used when the value is
truly variable; a search query is a good example of that.

I am a big fan of what are called "clean URLs", or URLs without the ?=&
gobbledygook.  What looks better to you?

  http://www.example.com/index.php?action=viewPage&page=productPage
or
  http://example.com/products

Why not please both Google *and* people by switching to clean URLs?

The best way to implement clean URLs requires some Apache
and/or .htaccess hackery with mod_rewrite, ErrorDocument, or
ForceType.  There are lots of examples of all of that out there.
However, if none of those will work for your situation (perhaps a
less-than-cooperative host), you can just put the pretty parameters
after the script name itself:
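
For the mod_rewrite route mentioned above, here is a minimal .htaccess
sketch.  The front-controller name (index.php) and the "path" parameter
are assumptions for illustration, not anything you must use:

```apache
# Sketch only: route all pretty URLs through index.php.
RewriteEngine On
# Leave requests for real files and directories alone.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Hand everything else to index.php, keeping the query string (QSA).
RewriteRule ^(.*)$ index.php?path=$1 [QSA,L]
```

The PATH_INFO fallback below needs none of this server configuration.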

  http://example.com/site.php/products

This will work on almost all web hosts.  In this example your site.php
page will not read the vars via $_GET but via $_SERVER['PATH_INFO'].
It's not quite as pretty as the .htaccess trickery, but it's still a
big step toward Google and people friendliness.
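
To make that concrete, here is a small PHP sketch of how site.php could
pull parameters out of PATH_INFO instead of $_GET.  The URL shape
(/site.php/products/42) and the segment names are assumptions for
illustration:

```php
<?php
// Sketch only: for a request to http://example.com/site.php/products/42,
// $_SERVER['PATH_INFO'] holds "/products/42" (everything after the
// script name), so we can split it into clean parameters.

function path_segments($pathInfo) {
    // "/products/42" -> array("products", "42"); empty pieces dropped.
    return array_values(array_filter(explode('/', $pathInfo), 'strlen'));
}

// In site.php itself you would feed it the real PATH_INFO:
//   $segments = path_segments(isset($_SERVER['PATH_INFO'])
//                             ? $_SERVER['PATH_INFO'] : '');
$segments = path_segments('/products/42');
$page = isset($segments[0]) ? $segments[0] : 'home';  // "products"
$id   = isset($segments[1]) ? $segments[1] : null;    // "42"
```

From there you dispatch on $page just as you would on a $_GET value.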

--lonnie


_______________________________________________

UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net
