Re: Preventing Proxy Servers From Hitting JSP Pages? (out-of-office notice)

2001-06-29 Thread Petra Hora

I am on vacation until July 9, 2001. During this time, please contact my
colleagues on team EW2.

Best regards,
Petra Hora



Re: Preventing Proxy Servers From Hitting JSP Pages?

2001-06-29 Thread Joe Laffey

> Hi All,
>
> Every day I get hundreds of hits on my JSP pages from proxy servers who are
> trying to determine if the links are still good. This is great for my
> static pages, but on pages with forms and processing logic it causes havoc.
>
> For example, if I have 3 pages of forms and the final page adds something
> to my database, hitting just the second page throws errors.
>
> I know that there is a pragma directive I need to add to each page, but
> isn't there also something that can be added to the HTTP header each time?
> And if so, what is the easiest way to add this to every outgoing header?

You can send out HTTP headers saying not to cache (Pragma: no-cache, etc.),
but I think you may want to implement some other scheme as well. If your
application cannot handle a request arriving at ANY page out of order, then
you may be in trouble.
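
For reference, here is a minimal (untested) sketch of setting those headers
from a servlet -- the class name and page content are made up, but
setHeader()/setDateHeader() are the standard HttpServletResponse calls, and
the same lines work inside a JSP scriptlet via the implicit "response"
object:

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class NoCachePage extends HttpServlet {
        protected void doGet(HttpServletRequest request,
                             HttpServletResponse response)
                throws ServletException, IOException {
            // "Pragma" is honored by HTTP/1.0 proxies,
            // "Cache-Control" by HTTP/1.1 caches.
            response.setHeader("Pragma", "no-cache");
            response.setHeader("Cache-Control", "no-cache");
            // An Expires date in the past marks the page as already stale.
            response.setDateHeader("Expires", 0);

            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.println("<html><body>Do not cache me.</body></html>");
        }
    }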

I would suggest checking the Referer header and redirecting clients to the
main page if the referer is NOT the page it is supposed to be. Note that
this is NOT a security measure (the referer is easily forged), but it can
help with this situation, or with people who have bookmarked pages.
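
Something along these lines (untested; the page names are made up) would do
the check at the top of the second form page's servlet:

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class FormStepTwo extends HttpServlet {
        // Hypothetical path of the page that is supposed to link here.
        private static final String EXPECTED_REFERER = "/step-one.jsp";

        protected void doGet(HttpServletRequest request,
                             HttpServletResponse response)
                throws ServletException, IOException {
            // The header really is spelled "Referer" in HTTP. It may be
            // absent or forged, so treat this as a convenience, not security.
            String referer = request.getHeader("Referer");
            if (referer == null
                    || referer.indexOf(EXPECTED_REFERER) == -1) {
                response.sendRedirect("/index.jsp");
                return;
            }
            // ... normal handling of the second form page ...
        }
    }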

You can also use a session (either servlet sessions or one you devise
yourself) and embed it in URLs (it could even be as simple as the current
time in millis). Then, if that session has expired (or too much time has
passed, in the case of the timestamp), you send the client to the main
page. Note that adding the current time to your requests pretty much makes
them non-cacheable anyway. Also note that most proxy servers will not
cache POST requests or GET requests with the "?" character in them (i.e.
with a query string). By adding a "?t=<current time>" query string you
make the page less likely to be cached. Also, the browser itself will not
cache the page between calls, because the time will be different each time
the page is generated.

Note that two clients COULD receive the same current time in millis,
because servlets handle requests concurrently and two requests can arrive
within the same millisecond. This is not a problem unless you make it one.
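
An (untested) sketch of the timestamp variant, assuming a hypothetical
two-page flow and a 30-minute expiry window:

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class TimestampedStep extends HttpServlet {
        private static final long MAX_AGE_MILLIS = 30L * 60L * 1000L;

        protected void doGet(HttpServletRequest request,
                             HttpServletResponse response)
                throws ServletException, IOException {
            long issued;
            try {
                // Long.parseLong() also throws on a null parameter, so a
                // proxy probing the bare URL gets redirected as well.
                issued = Long.parseLong(request.getParameter("t"));
            } catch (NumberFormatException e) {
                response.sendRedirect("/index.jsp");
                return;
            }
            if (System.currentTimeMillis() - issued > MAX_AGE_MILLIS) {
                // The makeshift "session" has expired.
                response.sendRedirect("/index.jsp");
                return;
            }
            // Embed a fresh timestamp in the link to the next step; the
            // changing query string also defeats most caching.
            response.setContentType("text/html");
            PrintWriter out = response.getWriter();
            out.println("<a href=\"/next-step.jsp?t="
                    + System.currentTimeMillis() + "\">Continue</a>");
        }
    }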

Joe Laffey
LAFFEY Computer Imaging
St. Louis, MO
--
Need to do multi-file string replacement in Un*x, but don't want to mess
with sed? Try rpl. It's a free text replacement utility with source.
http://www.laffeycomputer.com/rpl.html  -- Check it out!
''+. .+'''+. .+'''+. .+'''+. .+''+.
\   /   \   /   \   /   \   /  \
 `+...+' `+...+' `+...+' `+...+'`+..





Re: Preventing Proxy Servers From Hitting JSP Pages?

2001-06-29 Thread Fernando_Salazar


Try creating a file named robots.txt at your web-server root. Put lines in
the file like so:

User-agent: *
Disallow: /webapp

where "webapp" is the path to your web application.

Spiders and similar clients should read this file and follow the
directives there. See http://www.robotstxt.org/wc/robots.html for more
info.

- Fernando



   

"David M.  

Rosner"  To: [EMAIL PROTECTED]

   Subject:     Preventing Proxy Servers From 
Hitting JSP Pages? 
   

06/28/2001 

02:44 PM   

Please 

respond to 

tomcat-user

   

   





Hi All,

Every day I get hundreds of hits on my JSP pages from proxy servers who are
trying to determine if the links are still good. This is great for my
static pages, but on pages with forms and processing logic it causes havoc.

For example, if I have 3 pages of forms and the final page adds something
to my database, hitting just the second page throws errors.

I know that there is a pragma directive I need to add to each page, but
isn't there also something that can be added to the HTTP header each time?
And if so, what is the easiest way to add this to every outgoing header?

Thanks,

David M Rosner








Re: Preventing Proxy Servers From Hitting JSP Pages?

2001-06-28 Thread Dmitri Colebatch

I know a robots.txt in the root dir will stop spiders from doing this - not
sure if proxies are intelligent enough to respect it, but it might be worth
a try.

cheers
dim

On Fri, 29 Jun 2001 04:44, you wrote:
> Hi All,
>
> Every day I get hundreds of hits on my JSP pages from proxy servers who are
> trying to determine if the links are still good. This is great for my
> static pages, but on pages with forms and processing logic it causes havoc.
> For example, if I have 3 pages of forms and the final page adds something
> to my database, hitting just the second page throws errors.
>
> I know that there is a pragma directive I need to add to each page, but
> isn't there also something that can be added to the HTTP header each time?
> And if so, what is the easiest way to add this to every outgoing header?
>
> Thanks,
>
> David M Rosner



Preventing Proxy Servers From Hitting JSP Pages?

2001-06-28 Thread David M. Rosner

Hi All,

Every day I get hundreds of hits on my JSP pages from proxy servers who are 
trying to determine if the links are still good. This is great for my 
static pages, but on pages with forms and processing logic it causes havoc. 
For example, if I have 3 pages of forms and the final page adds something 
to my database, hitting just the second page throws errors.

I know that there is a pragma directive I need to add to each page, but 
isn't there also something that can be added to the HTTP header each time?
And if so, what is the easiest way to add this to every outgoing header?

Thanks,

David M Rosner