Do you mean protecting the actual HTML that generates the site from
being saved?  Or protecting static content from being hotlinked?  Or
preventing your site's content from being scraped for resyndication
somewhere else?

Basically, if it is published on the web, it can be saved.  Sites
disable the right mouse-click with JavaScript, but most people can
use keyboard shortcuts, or tell wget to spoof a browser's user-agent
string.  You could enforce some sort of cookie/session mechanism to
make it difficult for them to store URLs to come back to, but a
smart fetch agent would just respider the site.  Since you want to
prevent the content from being stolen, I'm guessing you don't want
the search engine bots to hit the site either, as they could cache
the files; a robots.txt file with "Disallow: /" would prevent the
honest bots from spidering/caching the content.  (Note that the
correct directive to block everything is "Disallow: /", not
"Disallow: *".)
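For example, a minimal robots.txt that tells all compliant crawlers
to stay out of the entire site would look like this (only honest
bots respect it; a scraper will ignore it):

```
User-agent: *
Disallow: /
```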

You might be able to replace render with some obfuscation function
that minifies/obfuscates the HTML/CSS/JS content.  If you could
rely on surfers having JavaScript enabled, you could use Ajax and
DOM scripting to generate the page client-side.
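As a rough sketch of the first idea, here's what a wrapper around a
render call might look like.  The render function below is a
hypothetical stand-in for whatever your framework provides, and the
whitespace-stripping is only a trivial minifier, not a real
obfuscator; anyone with a pretty-printer can undo it:

```python
import re

def render(template_name):
    # Hypothetical stand-in for a Pylons-style render call; in a
    # real app this would invoke the template engine.
    return """
    <html>
        <body>
            <p>Hello,   world</p>
        </body>
    </html>
    """

def obfuscated_render(template_name):
    """Wrap render() and strip whitespace so the served HTML is
    harder to read -- though trivially reversible."""
    html = render(template_name)
    # Remove whitespace runs between tags entirely, then collapse
    # any remaining whitespace runs to a single space.
    html = re.sub(r'>\s+<', '><', html)
    html = re.sub(r'\s+', ' ', html)
    return html.strip()

print(obfuscated_render('index.html'))
# prints: <html><body><p>Hello, world</p></body></html>
```

The point stands either way: this only makes the source annoying to
read, not impossible.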

However, no matter what you put on the page, a browser has to be
able to render it, and if a browser can render it, the content can
be viewed.  Even Firefox with the Web Developer toolbar has 'View
Generated Source', so even a page whose DOM is built largely by
JavaScript can be inspected.

All of this and you might just end up making the site less usable for
honest surfers.

-- 
You received this message because you are subscribed to the Google Groups 
"pylons-discuss" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/pylons-discuss?hl=en.