Try creating a file named robots.txt at your web-server root.  Put lines in
the file like so:

User-agent: *
Disallow: /webapp

where "webapp" is the path to your web application.

Spiders and similar clients should read this file and follow the directives
there.
Look here: http://www.robotstxt.org/wc/robots.html for more info.

- Fernando


From: "David M. Rosner" <dave@recommend-it.com>
To: [EMAIL PROTECTED]
cc: (bcc: Fernando Salazar/CAM/Lotus)
Subject: Preventing Proxy Servers From Hitting JSP Pages?
Date: 06/28/2001 02:44 PM
Please respond to tomcat-user

Hi All,

Every day I get hundreds of hits on my JSP pages from proxy servers that are
trying to determine if the links are still good. This is fine for my
static pages, but on pages with forms and processing logic it causes havoc.

For example, if I have 3 pages of forms and the final page adds something
to my database, hitting just the second page throws errors.

I know that there is a pragma directive I need to add to each page, but
isn't there also something that can be added to the HTTP headers each time?
And if so, what is the easiest way to add this to every outgoing response?
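
(For reference, one common way to set a header on every outgoing response in a
Servlet 2.3 container such as Tomcat 4 is a servlet filter. The sketch below is
only an illustration under that assumption; the class name NoCacheFilter and
the *.jsp mapping are placeholders, not part of any existing setup.)

    import java.io.IOException;
    import javax.servlet.*;
    import javax.servlet.http.HttpServletResponse;

    // Sketch: stamps cache-defeating headers on every response that passes through it.
    public class NoCacheFilter implements Filter {

        public void init(FilterConfig config) throws ServletException {
            // nothing to configure in this sketch
        }

        public void doFilter(ServletRequest request, ServletResponse response,
                             FilterChain chain)
                throws IOException, ServletException {
            if (response instanceof HttpServletResponse) {
                HttpServletResponse res = (HttpServletResponse) response;
                res.setHeader("Cache-Control", "no-cache, no-store"); // HTTP/1.1 caches
                res.setHeader("Pragma", "no-cache");                  // HTTP/1.0 caches
                res.setDateHeader("Expires", 0L);                     // already expired
            }
            chain.doFilter(request, response);
        }

        public void destroy() {
        }
    }

Such a filter would then be mapped to the JSP pages in web.xml, along these lines:

    <filter>
        <filter-name>NoCacheFilter</filter-name>
        <filter-class>NoCacheFilter</filter-class>
    </filter>
    <filter-mapping>
        <filter-name>NoCacheFilter</filter-name>
        <url-pattern>*.jsp</url-pattern>
    </filter-mapping>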

Thanks,

David M Rosner




