I've used both fopen() and the CURL libraries...
> > I need to grab some data from a CGI program running on a different
> > server. I have been toying around executing LYNX using the passthru
> > command. I seem to be able to grab the data just fine using "lynx
> > -source -preparsed $URL". I've started reading about CURL. I am
> > wondering if it is worth my effort to recompile my server to include
> > CURL support. Any feedback is appreciated.
>Why use curl (or an external program like lynx) when you can use
>file()/readfile()/fopen()/fsockopen(), depending if you just need the
>source from that cgi or need to mangle it a little?
>Using PHP built-ins will most likely be faster, or at least nicer to
>work with. IMHO (never used it) curl is only needed when you need to do
>something fancy like POST/PUT requests.
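The built-in approach the previous poster describes might look something like this; a minimal sketch, assuming a hypothetical CGI address (substitute your real $URL), using only standard PHP URL-wrapper functions:

```php
<?php
// Placeholder URL -- swap in the CGI program you actually need.
$url = 'http://example.com/cgi-bin/data.cgi';

// One-shot read of the whole response (PHP 4.3+):
$source = file_get_contents($url);

// Or stream it with fopen()/fread() if you want to mangle the
// output as it arrives:
$fp = fopen($url, 'r');
if ($fp) {
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        // ...parse $chunk here...
    }
    fclose($fp);
}
?>
```

This only works for plain GETs; as noted above, anything fancier (POST/PUT, headers, auth) is where curl starts to earn its keep.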
Though I was just doing simple GETs to websites, I happened to be doing a lot
of them in nested loops (get one page, parse a bunch of URLs out of it, get
a sub-page, find a link in it, get a file, etc.).
fopen()/fread() just started to crash after a couple of iterations. I don't
know if that's because I'm on Win2k or what, but I enabled CURL and tried
using it instead, and had NO PROBLEMS whatsoever. CURL turned out to be much,
much more reliable for my use.
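For comparison, here's roughly what the same GET looks like through the curl extension (requires PHP compiled --with-curl); again a sketch with a placeholder URL, using only the standard curl_* functions:

```php
<?php
// Placeholder URL -- same hypothetical CGI as before.
$ch = curl_init('http://example.com/cgi-bin/data.cgi');

// Return the body as a string instead of printing it directly.
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);

$source = curl_exec($ch);
curl_close($ch);
```

In a nested-loop scraper you'd create and close a handle like this per request (or reuse one handle and just curl_setopt() a new URL each iteration), which may be part of why it held up better than repeated fopen() calls did for me.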
just so's ya know,
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php