Artur,

I did some research this morning to unearth the latest best practices, and 
I'll share my findings below.

The information I found suggests that Google is capable of 1) indexing 
Flash content embedded via SWFObject and 2) cataloging content linked via 
URLs with hash marks.  However, the majority view seems to be that the best 
way to control what Google catalogs is to provide distinct page links that 
do not use the hash mark.

Google announced in mid-2008 that it could crawl Flash: "Now that we've 
launched our Flash indexing algorithm, web designers can expect improved 
visibility of their published Flash content, and you can expect to see 
better search results and snippets" ("Google learns to crawl Flash," 
http://googleblog.blogspot.com/2008/06/google-learns-to-crawl-flash.html, 
June 30, 2008).  That same day, Google asserted, "We've improved our 
ability to index textual content in SWF files of all kinds. This includes 
Flash 'gadgets' such as buttons or menus, self-contained Flash websites, 
and everything in between" ("Improved Flash indexing," 
http://googlewebmastercentral.blogspot.com/2008/06/improved-flash-indexing.html, 
June 30, 2008).  A year later, in the blog post "Flash indexing with 
external resource loading" 
(http://googlewebmastercentral.blogspot.com/2009/06/flash-indexing-with-external-resource.html, 
June 18, 2009), Google reported that they "just added external resource 
loading to our Flash indexing capabilities," meaning that "when a SWF file 
loads content from some other file—whether it's text, HTML, XML, another 
SWF, etc.—we can index this external content too, and associate it with the 
parent SWF file and any documents that embed it."

Regarding Google's capacity to see links with hash marks as unique URLs, I 
found a Betanews article from September 29, 2009, titled "Google vs. Yahoo 
vs. Bing on 'deep linking:' Does it make any difference?" 
(http://www.betanews.com/article/Google-vs-Yahoo-vs-Bing-on-deep-linking-Does-it-make-any-difference/1254260245) 
in which the author notes, "This week, all three of the world's top general 
search engines touted the addition of deep links to their search results, 
although Google has been actively experimenting with deep links since this 
time last year. The basic premise is this: For Web pages that have named 
anchors above selected subsections -- for example, <A NAME="Details"> -- the 
search engine is capable of generating subheadings in its search results 
that link users directly to those subsections, or at least to subsections 
whose titles imply they may have some bearing upon the query."

With all that noted, I find it telling that Adobe is still pushing for basic 
URLs to page content rather than relying on Google to crawl hash marks or SWF 
content directly.  A little over a month ago, in the "Deep Links and Dynamic 
Content" video which is part of the Adobe Developer Connection article "Adobe 
Flash and search engine optimization (SEO): Techniques, issues, and strategies" 
(http://www.adobe.com/devnet/seo/articles/flash_seo_videos.html, December 14, 
2009), Damien Bianchi specifically asserts that "Google does not index anything 
past the hash mark in the deep linking URL" and recommends providing basic URLs 
for spider consumption.

In an audio interview on December 9, 2009, Justin Everett-Church, senior 
product manager for designer/developer relations for Flash at Adobe, 
clearly articulated why we can't rely on Google's ability to crawl SWF 
content.  In that interview, Everett-Church noted, "Flash content or SWFs have been actually 
accessible to search engines for a while. In previous kind of incarnations, 
it's been able to decompile SWF and give all the strings out there. 
Unfortunately, that's not really getting out what an end user sees, what an end 
user experiences. So, we've had to come up with better solutions that give a 
more full description of the text links that are going on inside the SWF, how 
the end user actually is interacting with representing the hierarchy of the 
SWF. Without that full solution that we've implemented in the last couple of 
years, really, Flash search ability was less than it should be, but that's 
obviously why we did the work."

While there may come a day when we can structure our ActionScript code to 
precisely control what Google sees, for the time being the approach outlined 
by Michael Wyszomierski and Greg Grothaus in "A Spider's View of Web 2.0" 
(http://googlewebmastercentral.blogspot.com/2007/11/spiders-view-of-web-20.html, 
November 6, 2007) seems to offer the best hope of ensuring Google sees our 
sites the way our clients and visitors do.  As I read it, their advice 
boils down to progressive enhancement: serve meaningful HTML content first, 
then layer the Flash experience on top of it for capable browsers.
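
As a concrete illustration, here is a minimal sketch of that pattern using 
SWFObject 2.x; the file name and element id are my own placeholders, not 
something from the article:

    // Assumes swfobject.js is loaded and the page contains
    // <div id="content"> holding real, indexable HTML.  embedSWF()
    // swaps that div for the SWF only when Flash 9 or later is
    // detected, so spiders and Flash-less visitors keep the HTML.
    swfobject.embedSWF("site.swf", "content", "800", "600", "9.0.0");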

Bottom line: provide plain links to HTML pages, and when a Flash-enabled 
user agent visits one of those pages, redirect it to a hash-based URL so 
SWFAddress can deep-link to the proper Flash content.
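
Here is a rough sketch of that redirect, assuming SWFObject 2.x is 
available for player detection; the paths are my own placeholders:

    // On a plain HTML page such as /work/project1.html (hypothetical
    // path), send Flash-capable browsers to the SWFAddress deep link;
    // spiders and Flash-less visitors stay on the indexable HTML page.
    var player = swfobject.getFlashPlayerVersion();
    if (player.major >= 9) {
        window.location.replace("/#/work/project1");
    }

On the Flash side, SWFAddress raises its change event whenever the fragment 
changes, so the SWF can listen for that event and display the matching 
section.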

I intend to keep an eye on the Adobe Search Engine Optimization Technology 
Center at http://www.adobe.com/devnet/seo/ to stay on top of what Adobe 
recommends on this front.

Thanks,
Raymond Simmons
Neon Sky Creative Media, Inc.

---- artur <ar...@artur.com> wrote: 
> was wondering if there are any bulletproof SEO solutions out there
> besides doing a mod re-write for crawlers.
> 
> does google still penalize for this?
> 
> thanks
> 
> artur
> 


_______________________________________________
Flashcoders mailing list
Flashcoders@chattyfig.figleaf.com
http://chattyfig.figleaf.com/mailman/listinfo/flashcoders
