I'm trying to use curl in a script to download a CSV file nightly via FTP. Because of firewall restrictions, the download has to go through a Squid proxy.
This doesn't work:

$ftpproxy = "http://192.168.0.1:3128";
# ...
try {
    $ch = curl_init();
    $url = "ftp://${ftpserver}/$(unknown)";
    echo "Fetching $url\n";
    curl_setopt($ch, CURLOPT_VERBOSE, true);
    curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, true);
    curl_setopt($ch, CURLOPT_PROXY, $ftpproxy);
    curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP);
    curl_setopt($ch, CURLOPT_SSLVERIFYPEER, false);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_USERPWD, "${ftpuser}:${ftppass}");
    ob_start();
    if (curl_exec($ch)) {
        curl_close($ch);
        $filecontents = ob_get_contents();
    } else {
        echo "Error: failed to download $(unknown)\n";
        $error = true;
    }
    ob_end_clean();
    unset($ch);
    if ($debug) echo "${filecontents}\n\n";
}

This does (with FTP_PROXY set as an environment variable):

try {
    $url = "ftp://${ftpuser}:[EMAIL PROTECTED]/$(unknown)";
    $filecontents = `/usr/local/bin/curl ${url}`;
    if (preg_match('/<HTML>/i', $filecontents)) {
        throw new Exception("FTP Proxy error");
    }
    if ($debug) echo "${filecontents}\n\n";
}

I'd rather not invoke a separate curl process if I can avoid it. Is there some obvious mistake in the first version of the script?

_______________________________________________
UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net
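For what it's worth, here is a minimal sketch of how the first version could be restructured, under two assumptions: that CURLOPT_RETURNTRANSFER can replace the output-buffering dance, and that dropping CURLOPT_HTTPPROXYTUNNEL lets Squid gateway the FTP URL over plain HTTP (Squid typically refuses CONNECT tunnels to ports other than 443, which is one plausible reason the tunnel variant fails). The fetch_via_proxy() wrapper and the example values are hypothetical, not from the original post:

```php
<?php
// Sketch only: build the curl option set for fetching an FTP URL
// through an HTTP (Squid) proxy. Variable names mirror the original post.
function proxy_curl_options($url, $user, $pass, $proxy) {
    return [
        CURLOPT_URL            => $url,
        CURLOPT_USERPWD        => "$user:$pass",
        CURLOPT_PROXY          => $proxy,
        CURLOPT_PROXYTYPE      => CURLPROXY_HTTP,
        // Note: no CURLOPT_HTTPPROXYTUNNEL here. Without it, curl asks the
        // proxy to GET the ftp:// URL over HTTP (which Squid supports) rather
        // than trying to CONNECT to port 21 (which Squid usually denies).
        CURLOPT_RETURNTRANSFER => true,  // return the body from curl_exec()
    ];
}

// Hypothetical wrapper: fetch the URL and return its contents, or throw.
function fetch_via_proxy($url, $user, $pass, $proxy) {
    $ch = curl_init();
    curl_setopt_array($ch, proxy_curl_options($url, $user, $pass, $proxy));
    $contents = curl_exec($ch);
    if ($contents === false) {
        $err = curl_error($ch);
        curl_close($ch);
        throw new Exception("Download failed: $err");
    }
    curl_close($ch);
    return $contents;
}
```

With CURLOPT_RETURNTRANSFER set, curl_exec() returns the file contents directly (or false on failure), so ob_start()/ob_get_contents()/ob_end_clean() are no longer needed and curl_error() gives a concrete failure reason instead of a bare "Error: failed to download".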
