Hi all,

Exhibit suffers from the same Achilles heel as other Ajax applications: 
the dynamic content that gets inserted on-the-fly is totally invisible 
to Google. My whole web site is now invisible to Google :-) Perhaps this 
is the biggest impediment to adoption.

Johan has added some code that allows Exhibit to load data from HTML 
tables. This lets your data be shown even if JavaScript is disabled and 
lets your data be visible to Google. However, HTML tables are a clunky 
way to store data.
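To make the table approach concrete, here is a minimal sketch of the 
conversion step. The function name and the sample data are mine, not 
Johan's actual code; in the browser the rows would be pulled from the 
table's DOM (e.g. via table.rows), but plain arrays keep the idea 
visible:

```javascript
// Hypothetical sketch: turn an HTML table's header and rows into
// Exhibit-style JSON items. "tableToItems" and the sample data are
// illustrative names, not Exhibit's real API.
function tableToItems(header, rows) {
    return rows.map(function (row) {
        var item = {};
        header.forEach(function (prop, i) {
            item[prop] = row[i]; // column name -> property name
        });
        return item;
    });
}

// Example: a two-column table of labels and dates.
var items = tableToItems(
    ["label", "date"],
    [["Apollo 11", "1969-07-20"],
     ["Apollo 12", "1969-11-19"]]
);
```

The nice property is that the same markup serves both audiences: Google 
and script-less browsers see an ordinary table, while Exhibit sees it 
as a data source.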

There is another alternative: embedding your data encoded as JSON 
between <pre>...</pre> tags and having Exhibit grab that text out 
and eval(...) it. If JavaScript is disabled, the data is displayed as 
raw JSON--not so pretty.
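The grab-and-eval step would look roughly like this. The element id is 
hypothetical, and in a real page the string would come from the DOM 
(something like document.getElementById("exhibit-data").textContent); 
here it is inlined so the eval() detail is easy to see:

```javascript
// Hypothetical sketch: the JSON payload sits in the page as plain text
// inside <pre id="exhibit-data">...</pre>. Exhibit would read that text
// out of the DOM; here it is just a string.
var preText = '{ "items": [ { "label": "Apollo 11" } ] }';

// The wrapping parentheses force eval() to parse the braces as an
// object literal rather than as a block statement.
var data = eval("(" + preText + ")");
```

One caveat worth noting: eval() will happily run any script embedded in 
the payload, so this is only safe when the page author controls the 
data.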

However, if the data is fed from another source, such as Google 
Spreadsheets, then neither of these approaches can be used.

We've also entertained the idea of using the browser's Save Page As... 
feature to snapshot a rendered exhibit and then using that as the public 
page. Exhibit still gets loaded into that page, but it would initially 
not change the DOM until some user action requires it to. However, the 
browser's Save Page As... feature doesn't do a very good job of saving 
the generated DOM.

So, I think anything we do would look pretty much like a hack and work 
for only some cases. We also risk getting blacklisted by Google's 
crawler. So, what do we do? Is it possible to ask Google to scrape those 
exhibit-data links in the heads of the pages? And how do we do that?

David

_______________________________________________
General mailing list
[email protected]
http://simile.mit.edu/mailman/listinfo/general
