On 2/21/07, Matt Chotin <[EMAIL PROTECTED]> wrote:
> Quick little poll for you guys (written late at night, not validated by
> my bosses, please be gentle...). There has been plenty of discussion on
> this list and elsewhere about deep linking and SEO for Flex apps.
> Various solutions are currently available for deep linking, such as
> URLKit or SWFAddress. However, a robust solution for SEO is not
> available, primarily because the search engine vendors need to
> cooperate a little bit. What could the search engine guys do to read
> the data presented in some Flex field based on a database search? (The
> most popular suggestion these days is to return HTML instead of a SWF
> when you think it's a search engine viewing your page; however, this is
> known as "cloaking" and the search engines frown upon it. Think about
> how porn sites operate and you'll know why.)
This is not accurate. The search engines only consider this "cloaking", and frown upon it, when there is deception. It is considered perfectly appropriate for a page to return data in different formats, so long as the content is the same and there is no effort to trick the user or the search engines. This is not just me saying this; it is based on direct feedback from people at Google.

> Deep linking on its own of course has benefits: better URL support in
> the address bar for bookmarking, copying into an IM or email, and even
> cleaner history management. But I would contend that most folks want
> deep linking so that pages can show up in a search engine; that while
> deep linking is nice, SEO is even more important.
>
> Q1: Do you think about deep linking primarily in the context of SEO, or
> do you often want deep linking for non-search-related tasks?
While I do not see deep linking as being only for SEO, I do think the solutions should be integrated. You can't do SEO without deep linking, and search is the primary revenue driver for the entire internet, so it is clearly far more important than deep linking by itself.

> We are currently investigating adding deep linking support into the
> Flex framework (similar perhaps to URLKit, but more integrated; design
> still TBD). However, I also want your opinion on whether you think we
> should be investing here at this time, knowing that we may not have an
> SEO solution at the same time. Note that, as I said, SEO also requires
> search engine help, so this is not a straightforward tradeoff.
>
> Q2: Would you have us invest in deep linking before SEO, or is it of
> more use to you if they come out together? Are current deep-linking
> solutions sufficient for you at this time?
I cannot imagine a scenario where these can be credibly separated. I think URLKit is fine for now if you are not solving the search addressability issue, so my vote would be for a comprehensive solution. By the way, it sounds like you are considering a solution that is different from returning different data from the server when the request comes from a search engine, since you have been considering that "cloaking". I would really like to hear what alternate strategy you are considering that would allow database-driven sites to work without returning alternate XHTML when hit by a search engine, since I am having a hard time imagining, conceptually, how such a thing would work.

Regards,
Hank
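[Editor's note: the alternate-format approach debated above, the same URL serving plain HTML to crawlers and the SWF-embedding page to browsers, amounts to user-agent content negotiation. A minimal sketch follows; the function names, the bot-token list, and the markup are all illustrative assumptions, not anything from Flex or from the search engines.]

```python
# Hypothetical sketch: one URL, one underlying piece of content, two
# presentation formats depending on who is asking. The crawler list and
# page markup below are illustrative only.

# A few common crawler User-Agent substrings (not exhaustive).
CRAWLER_TOKENS = ("googlebot", "slurp", "msnbot", "bingbot")


def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header looks like a search engine bot."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)


def render_page(user_agent: str, title: str, body_text: str) -> str:
    """Serve the same content in the format the requester can read."""
    if is_crawler(user_agent):
        # Crawlers get indexable HTML containing the real content.
        return (f"<html><head><title>{title}</title></head>"
                f"<body><p>{body_text}</p></body></html>")
    # Browsers get the page embedding the Flex SWF; the SWF then loads
    # the same content (e.g. the same database query) at runtime.
    return (f"<html><head><title>{title}</title></head><body>"
            f"<object data='app.swf' "
            f"type='application/x-shockwave-flash'></object>"
            f"</body></html>")


# Example: a crawler and a browser hitting the same deep link.
print(is_crawler("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print("app.swf" in render_page("Mozilla/5.0 (Windows NT)",
                               "Item 42", "Details..."))      # True
```

As the thread notes, this only avoids the "cloaking" label if the HTML and SWF views genuinely carry the same content; the branching is on format, never on substance.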

