You are right that a spider will index SSL pages, and you can set up two
robots.txt files: one for SSL and one for non-SSL.
However, I'm not sure how self-signed certificates are handled or how the
performance difference would affect results.
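To illustrate the two-robots.txt idea: a minimal sketch in Ruby (names like `robots_txt_for` are mine, not a real API) that picks a different robots.txt body depending on the request scheme, e.g. allowing crawlers on the plain-HTTP site while keeping them off the SSL version:

```ruby
# Hypothetical sketch: serve different robots.txt content for HTTP vs HTTPS.
# The policy shown (allow on HTTP, disallow on HTTPS) is just one example.

ROBOTS_HTTP = <<~TXT
  User-agent: *
  Allow: /
TXT

ROBOTS_HTTPS = <<~TXT
  User-agent: *
  Disallow: /
TXT

# Return the robots.txt body for the given request scheme ("http" or "https").
def robots_txt_for(scheme)
  scheme == "https" ? ROBOTS_HTTPS : ROBOTS_HTTP
end
```

In a Rack or Rails app you would route `/robots.txt` to something like this, checking the request scheme (and any `X-Forwarded-Proto` header if you're behind a proxy).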

In other words, I was wrong. That said, I wouldn't risk it too much. If you
look at Google's search results, you will see that more often than not,
non-SSL pages rank higher than SSL pages. Google your Twitter username, for
instance, and see what result comes up.

I'm no SEO expert so take my advice with a huge grain of salt.

- Matt


On Tue, Oct 11, 2011 at 8:55 PM, Guyren Howe <[email protected]> wrote:

> On Oct 11, 2011, at 19:40 , Matt Aimonetti wrote:
>
> I would also not force SSL for any public pages for SEO reasons.
>
>
> Please elaborate. Why would encrypting the traffic between your server and
> Google (or whoever) affect your SEO rank? It's not like it would prevent
> anyone seeing the same page at the same URL.
>
> I wouldn't be surprised to see google *favor* pages with signed site
> certificates.
>
> Here:
>
>
> http://www.seochat.com/c/a/Search-Engine-Optimization-Help/SSL-https-Protocol-for-SEO-Tips/
>
> is a discussion appearing to indicate that if properly implemented, SSL
> will not adversely affect SEO.
>
> For my part, I think it is a mistake that the entire internet isn't
> encrypted from the get-go. I don't use it routinely, but anything even
> trivially secure I just do via SSL as a matter of course. This is the first
> suggestion I've heard that this could adversely affect SEO.
>
> --
> SD Ruby mailing list
> [email protected]
> http://groups.google.com/group/sdruby
>

