Simply put the sensitive files outside the web root and have CF stream them
out as needed - no spider is going to index files it can't reach ;-)
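As a minimal sketch of that idea (the directory path, session flag, and URL
parameter names here are all assumptions - adjust to your app):

```cfml
<!--- download.cfm - gatekeeper page for files stored OUTSIDE the web root --->

<!--- refuse anyone who isn't logged in (session.loggedIn is a placeholder
      for however your site tracks authentication) --->
<cfif NOT structKeyExists(session, "loggedIn") OR NOT session.loggedIn>
    <cflocation url="login.cfm" addtoken="false">
</cfif>

<!--- strip any path components so a request can't walk out of the folder --->
<cfset safeName = listLast(url.file, "/\")>
<cfset fullPath = "C:\securefiles\" & safeName>

<cfif NOT fileExists(fullPath)>
    <cfabort showerror="File not found">
</cfif>

<!--- push the file down the wire; the browser never sees the real path --->
<cfheader name="Content-Disposition" value="attachment; filename=#safeName#">
<cfcontent type="application/octet-stream" file="#fullPath#">
```

Then your site's links just point at something like
download.cfm?file=report.pdf, and since C:\securefiles\ is outside the web
root, there's no direct URL for Google to index.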

My 2 cents from the 20 seconds I thought about this - HTH ;-)

...and I'm sure you'll get loads of other options - this one, to me, is
simple and solid.

Cheers

On Fri, 2012-08-10 at 14:02 -0400, Robert Rhodes wrote:

> Hello everyone.
> 
> I have a site where a password is required to access it.  On pages in
> the site, there are links to download files.  I set the appropriate meta
> tags and robots.txt to tell the search engines not to spider the site.
> 
> Though the site pages are not in Google, the files are showing up.  That's
> bad.
> 
> It's a lot of files, so before I code up a solution to access them all
> through logic so I can control the permissions, is there some way to
> protect a directory so that files can't be downloaded without being logged
> in on the site?
> 
> My guess is the answer is no, but I thought I would ask.
> 
> -RR
> 
> 
> 

Archive: 
http://www.houseoffusion.com/groups/cf-talk/message.cfm/messageid:352094