CLIENT-CERT over secure and non-secure connectors

2002-12-17 Thread Michael Yates
Hi all,

I have an unusual set-up/configuration question.

I wish to have a single instance of a web-app accessible over both HTTP and
HTTPS (with the HTTPS users authenticating with client certificates). The
reason for this configuration is that the non-secure port may be handling
traffic coming over (say) a VPN, which already provides all of the security
required, whereas the secure port may be more open and available to the
general public.

However, if I add
<auth-method>CLIENT-CERT</auth-method>
along with the other necessary security setup in my web-app's web.xml file,
Tomcat uses the SSLAuthenticator valve when processing both the HTTP and the
HTTPS requests. This means traffic coming over plain HTTP gets stopped with
errors like "no certificate chain".
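For reference, the security setup I mean is along these lines (the
url-pattern and role name below are just placeholders for whatever the real
application uses):

  <security-constraint>
    <web-resource-collection>
      <web-resource-name>Whole Application</web-resource-name>
      <url-pattern>/*</url-pattern>
    </web-resource-collection>
    <auth-constraint>
      <role-name>certuser</role-name>
    </auth-constraint>
  </security-constraint>

  <login-config>
    <auth-method>CLIENT-CERT</auth-method>
  </login-config>

  <security-role>
    <role-name>certuser</role-name>
  </security-role>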

Can anyone see any way to have the one web-app require a client certificate
when the user comes in over HTTPS, but allow them access without one when
they come in over HTTP?

Regards,
Michael Yates
Software Engineer
Australia (Wollongong) R&D
[EMAIL PROTECTED]
ESN 639-7547 Direct +61 2 42547547



HTTP/1.1 pipelined request processing order

2002-12-01 Thread Michael Yates
Hi all,

From some testing I have done, it appears Tomcat ensures that pipelined
(HTTP/1.1) requests are handled in order by only handing off request #2
after request #1 has completely finished processing. This adds quite a delay
when processing a sequence of lengthy requests.

Say two requests arrive in an HTTP/1.1 pipeline very close together, and each
request takes 10 seconds to process.
The behavior I have seen is that:
* Request 1 is handed to the servlet and allowed to process 
* Response 1 is written out on the wire
* Request 2 is handed to the servlet to process
* Response 2 is written out on the wire.

This takes a total of just over 20 seconds. 

However, if the client had NOT used pipelining (which should be more
efficient) and had instead opened two connections to the server, then request
1 and request 2 would both have been processed in a total (start-to-end) time
of just over 10 seconds, although using more sockets and more packets.

Is there a way Tomcat can be configured so that requests are handed to the
servlet for processing as soon as they arrive (obviously on separate
threads)?

Has anyone written any custom code to ensure the responses going back out on
the wire are in the same order as the requests coming in (as HTTP/1.1
requires)?

If this functionality isn't currently implemented in Tomcat 4, where (in the
code) would be the best place to go about adding it for our custom solution?
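
To make the ordering requirement concrete, the kind of thing I am picturing
is roughly the sketch below. It is only an illustration of the idea; the
class names, the Handler interface and the thread-pool setup are all made up
for the example and are not Tomcat internals:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch only: hand pipelined requests to worker threads as they arrive,
// but write the responses back in arrival order, as HTTP/1.1 requires.
// Requests and responses are reduced to strings; Handler stands in for
// the servlet processing step.
public class PipelineOrderingSketch {

    interface Handler {
        String handle(String request);
    }

    private final ExecutorService workers = Executors.newFixedThreadPool(4);

    public void serve(List<String> pipelinedRequests, final Handler handler,
                      Appendable wire) throws Exception {
        // Hand each request off for processing as soon as it arrives,
        // remembering the futures in arrival order.
        List<Future<String>> inArrivalOrder = new ArrayList<Future<String>>();
        for (final String request : pipelinedRequests) {
            inArrivalOrder.add(workers.submit(new Callable<String>() {
                public String call() {
                    return handler.handle(request);
                }
            }));
        }
        // Drain the results strictly in arrival order, so response N never
        // goes out on the wire before response N-1, even if it finished
        // processing first.
        for (Future<String> response : inArrivalOrder) {
            wire.append(response.get());
        }
    }
}

With two 10-second requests this would give a total time of just over 10
seconds while still keeping the responses in request order on the wire.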

Regards,
Michael