Hi

I'm looking for the fastest way to fetch multiple pages from PHP.
Currently I'm using the curl library and it works fine - I just don't
understand how to make curl store multiple requests in multiple
files.
I'll try to explain my problem in more detail:
Somewhere out there are the pages
http://foo.bar.com/page1.html
http://foo.bar.com/page2.html
http://foo.bar.com/page3.html
I want to fetch them with PHP and write them locally to the files
foobarpage1.html
foobarpage2.html
foobarpage3.html
And I want this to be as fast as possible.
I tried the following code:

$fh1 = fopen("foobarpage1.html", "w");
$fh2 = fopen("foobarpage2.html", "w");
$fh3 = fopen("foobarpage3.html", "w");

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://foo.bar.com/page1.html";);
curl_setopt($ch, CURLOPT_FILE, $fh1);
curl_exec($ch); //exec1
curl_setopt($ch, CURLOPT_URL, "http://foo.bar.com/page2.html";);
curl_setopt($ch, CURLOPT_FILE, $fh2);
curl_exec($ch); //exec2
curl_setopt($ch, CURLOPT_URL, "http://foo.bar.com/page3.html";);
curl_setopt($ch, CURLOPT_FILE, $fh3);
curl_exec($ch);
curl_close($ch);

fclose($fh1);
fclose($fh2);
fclose($fh3);

It just tells me that it's not possible to execute one handle multiple
times. If I delete the lines marked exec1 and exec2 it works, but then
it only stores the last page...
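
I guess I could create a fresh handle with curl_init() for every page
and close it again afterwards, but then the downloads still run one
after the other. What I suspect I actually need are the curl_multi
functions, so that the three transfers run in parallel. Something like
the following sketch is what I have in mind (not tested, same URLs and
file names as above, and I'm not sure the exec loop is how it's meant
to be done):

$urls = array(
    "http://foo.bar.com/page1.html" => "foobarpage1.html",
    "http://foo.bar.com/page2.html" => "foobarpage2.html",
    "http://foo.bar.com/page3.html" => "foobarpage3.html",
);

$mh      = curl_multi_init();
$handles = array();
$files   = array();

foreach ($urls as $url => $filename) {
    $fh = fopen($filename, "w");
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FILE, $fh); // response body goes straight into the file
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
    $files[]   = $fh;
}

// run all transfers at once; loop until every handle is finished
// (a plain busy loop - I guess curl_multi_select() would be nicer)
$running = 0;
do {
    curl_multi_exec($mh, $running);
} while ($running > 0);

// clean up
foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
foreach ($files as $fh) {
    fclose($fh);
}

Is that the right direction, or is there a simpler/faster way?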

Can anyone help?

TIA & greets
Stefan Rusterholz


