I'm getting quite a few sites to retrieve part of their content from a second site. Although there is a fair amount of code sharing between the clients, I still end up with multiple copies of the calling code, so to simplify things I just want to pass the URL to the central server, have it do the right thing, then pass back a fragment of HTML I can paste directly into the site output.
Example:

<?php
// File: universal_footer.inc
// Anything in this file gets added to all page footer menus of
// all non-steam-engine sites
$server = $_SERVER["HTTP_HOST"];
$page   = $_SERVER["REQUEST_URI"];
if ( substr( $page, 0, 10 ) == '/index.php' )
    $page = '/';    // <-- I really want to lose this line
print "<br/>" . file_get_contents(
    "http://whatever.example.com/get.php?domain=" . urlencode( $server ) .
    "&path=" . urlencode( $page ) . "&block=9&format=pipe"
) . "\n";

The central server can then retrieve the client server & URI from the $_REQUEST (or $_GET) array.
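Server side, the receiving end would look something like this rough sketch -- lookup_block() is a made-up stand-in for however the fragment actually gets found, and I've ignored the format= handling:

<?php
// File: get.php (sketch) -- runs on the central server.
// The clients are untrusted, so validate everything before use.
$domain = isset( $_GET['domain'] ) ? $_GET['domain'] : '';
$path   = isset( $_GET['path'] )   ? $_GET['path']   : '/';
$block  = isset( $_GET['block'] )  ? (int) $_GET['block'] : 0;

// lookup_block() is hypothetical: map (domain, path, block)
// to a stored, already-sanitised HTML fragment.
$html = lookup_block( $domain, $path, $block );

header( 'Content-Type: text/html; charset=utf-8' );
print $html;
?>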
I already have 4 very similar fragments & am about to create another 7 or 8, so I want to keep this simple for the client. I'd prefer to convert it back to my original one-liner:

<?php print "<br/>" . file_get_contents(
    "http://whatever.example.com/get.php?domain=" . urlencode( $_SERVER["HTTP_HOST"] ) .
    "&path=" . urlencode( $_SERVER["REQUEST_URI"] ) . "&block=9&format=pipe"
) . "\n"; ?>

To do this I want the central server to take over responsibility for removing unhelpful parameters (e.g. sid=) from the passed-in URI, retaining only the interesting ones. If this was a direct call to the central server I could simply use $_GET / $_REQUEST and copy the interesting items out of it. Instead, I'm thinking of passing $_SERVER["QUERY_STRING"] through as an extra URL parameter from the client, then using parse_str( $_REQUEST['qs'], $parsed ) server side and picking the interesting parameters out of $parsed[] -- something like the sketch below. Is there any functional difference between passing $_SERVER["QUERY_STRING"] through like that, or just using the "query" result from a server-side parse_url()?
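To make the question concrete, here's a rough sketch of both options side by side on the central server -- the whitelist is just an illustration, not my real parameter list:

<?php
// Option 1: client passes $_SERVER["QUERY_STRING"] through as &qs=...
$qs = isset( $_GET['qs'] ) ? $_GET['qs'] : '';

// Option 2: recover the query portion of the path the client
// already sends (REQUEST_URI includes the query string).
$parts = parse_url( isset( $_GET['path'] ) ? $_GET['path'] : '/' );
$qs2   = isset( $parts['query'] ) ? $parts['query'] : '';

// Either way, parse it and keep only the interesting parameters;
// sid= and anything else uninteresting simply drops out.
parse_str( $qs, $parsed );
$interesting = array( 'page', 'lang' );   // illustrative whitelist only
$kept = array();
foreach ( $interesting as $name ) {
    if ( isset( $parsed[$name] ) )
        $kept[$name] = $parsed[$name];
}
?>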
Has anyone tried doing anything like this before? Does my approach sound reasonable & are there any gotchas I should be looking for?

The thought's occurred to me that rather than splitting the client's $_SERVER["REQUEST_URI"] using parse_url() server-side, I could use the client's $_SERVER["SCRIPT_NAME"], except that http://www.example.com/ and http://www.example.com/index.php apparently both produce the same $_SERVER["SCRIPT_NAME"]. Is there anything simple I can use to do this reliably client-side?

Thanks

Bruce

PS - My security model is that the central server doesn't trust the clients and validates its input, but the client sites 100% trust the central server - I guard against XSS & injection attacks in the central server, so clients really can just dump the output straight into the HTTP output stream.

B
--
Bruce Clement
Home: http://www.clement.co.nz/
Twitter: http://twitter.com/Bruce_Clement
Directory: http://www.searchme.co.nz/

"Before attempting to create something new, it is vital to have a good
appreciation of everything that already exists in this field."
Mikhail Kalashnikov