Torsten Förtsch wrote:
> On 11/23/2012 06:39 PM, André Warnier wrote:
>> 1) is that "early enough" to be before the Apache ProxyPass step?
> yes, the proxy works as usual in the response phase.
>> 2) can I set the "(corresponding hostname)" above in such a Perl
>> handler, or otherwise manipulate the URI before it gets proxied?
> yes
>> 3) do I need this ProxyPass directive in my configuration, or can I just
>> set the Apache response handler to be mod_proxy_http in one of the Perl
>> handlers? And if yes, how?
> no, the fixup handler can configure the request to be handled by mod_proxy:
>
>     # Perhaps 2 is better here, since your server actually acts as a
>     # reverse proxy. The differences are subtle. Have a look at the code
>     # for better understanding.
>     $r->proxyreq(1);
>     $r->filename("proxy:" . $url);
>     $r->handler('proxy_server');
>
> That's what I would begin with. $url is then 'http://www.site-...'.
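For context, here is a minimal sketch of how those three calls might sit inside a complete fixup handler. The package name, the target host, and the URI mapping are my own assumptions for illustration, not part of the quoted answer; the real mapping would consult the subscription data discussed below.

```perl
# A hypothetical PerlFixupHandler; enable with
#   PerlFixupHandler My::ProxyFixup
# in httpd.conf (mod_proxy and mod_proxy_http must be loaded).
package My::ProxyFixup;

use strict;
use warnings;

use Apache2::RequestRec ();
use Apache2::Const -compile => qw(OK);

sub handler {
    my ($r) = @_;

    # Assumed mapping: send the incoming URI to one fixed target site.
    # Real code would pick the host per site and per subscription.
    my $url = 'http://www.example.org' . $r->uri;

    $r->proxyreq(2);                  # 2 = reverse proxy, as suggested above
    $r->filename("proxy:$url");
    $r->handler('proxy_server');      # let mod_proxy produce the response

    return Apache2::Const::OK;
}

1;
```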
Many thanks. I would not have found this by myself. Of course, now that I know...
http://perl.apache.org/docs/2.0/api/Apache2/RequestRec.html#C_proxyreq_
but I would not have looked there.
> A problem here is perhaps if the user wants to use https.
Yes, that's the next issue:
5) https://www.site-5.edu
> You can of course fake up a certificate. You perhaps can even have the
> browser accept it without warning. But nevertheless, a user with
> sufficient knowledge can identify the man in the middle.
That last point is not really an issue here. The users are (supposed to be)
cooperative.
The real context is that the client/company has subscriptions with certain sites, limited
in the number of users and nominative (tied to named users). If a user is not registered
yet, the message they receive tells them to get in touch with the appropriate department
to obtain a subscription.
If they are registered, then they can be passed through to the target
transparently.
The reason for wanting to manipulate the URIs is that on some of these sites the
subscriptions are only valid for a part of the site.
Also, I don't necessarily want to do all this heavy-duty stuff for all calls to the
external sites (e.g. not for CSS, images, etc.). That's why I wanted to achieve this with
mod_perl, to retain a maximum of flexibility. Once you start combining a bunch of
criteria on multiple target sites, things get pretty messy with the standard Rewrite*,
Proxy*, and SetEnvIf directives.
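For what it's worth, the "skip the heavy-duty stuff for static resources" decision can live in a small plain-Perl helper that a handler calls before doing any subscription work. A minimal sketch; the function name and the extension list are my own assumptions:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper: decide whether a request URI needs the full
# subscription check, or can be passed straight to mod_proxy. A real
# PerlTransHandler or PerlFixupHandler would call this first and skip
# the expensive lookup for static resources.
sub needs_subscription_check {
    my ($uri) = @_;

    # Assumed list of "cheap" static extensions; adjust as needed.
    return 0 if $uri =~ m{\.(?:css|js|png|gif|jpe?g|ico)$}i;
    return 1;
}

print needs_subscription_check('/journals/issue42/article.html'), "\n"; # 1
print needs_subscription_check('/static/style.css'), "\n";              # 0
```

Keeping the criteria in Perl like this is exactly where it stays more maintainable than stacking RewriteCond/SetEnvIf rules per site.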
The browser warning may be an issue because it's ugly from a user's point of view, and we
would have to tell them to disregard it. It may even be that they are not allowed to
disregard it, if their corporate IE policies are configured that way.
In this case, there are 10 sites in total, and one of them is an https:// one. But there
may be more in the future.
I'll cross that bridge when I get to it.
> See also
> http://foertsch.name/ModPerl-Tricks/using-modproxy-as-useragent.shtml
> for a different but also unusual usage of mod_proxy.
Now that I read this, I remember seeing that before. But I forgot about it.
Thanks for reminding me.