[PHP] file_get_contents for URLs?

2009-04-07 Thread Skip Evans

Hey all,

I'm doing some maintenance work on an existing system and 
there is a piece of code that uses file_get_contents() to read 
data from a URL, which is fine in theory I suppose.


But the problem is sometimes the server where that URL lives 
is not available, and the system hangs indefinitely.


Shouldn't this be done with curl, and if so, can it be done so 
that the call will time out and return control when the 
server is not available?


Any other recommendations?

I just came across this code and it's one of the client's 
biggest complaints.


--

Skip Evans
Big Sky Penguin, LLC
503 S Baldwin St, #1
Madison WI 53703
608.250.2720
http://bigskypenguin.com

Those of you who believe in
telekinesis, raise my hand.
 -- Kurt Vonnegut

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] file_get_contents for URLs?

2009-04-07 Thread Jan G.B.
Well, you might want to do it with curl, you might want to write your
own socket script, or you could just check the return value of
file_get_contents() - it'll be false on failure, and it won't try to
fetch an unreachable URL forever. I'd guess the error is somewhere
else if your script really hangs indefinitely.
I'm using this function that way in a daily cronjob, and the
remote server isn't so stable... trust me. ;)
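
For illustration, a minimal curl sketch along those lines might look
like this (the URL, timeout values and error handling are placeholders,
not anything taken from this thread):

<?php
// Fetch a URL with curl, giving up instead of hanging if the remote
// server is slow or unreachable.
$ch = curl_init('http://example.com/data.xml');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of echoing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);     // max seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // max seconds for the whole request
$data = curl_exec($ch);

if ($data === false) {
    // Connection failed or timed out; log it and move on instead of hanging.
    error_log('Fetch failed: ' . curl_error($ch));
}
curl_close($ch);
?>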

But the timeout can also be set in php.ini, or as suggested in the
comments on the php.net manual page:
http://www.php.net/manual/en/function.file-get-contents.php#82527
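
For example, a rough sketch of that approach - default_socket_timeout
is the php.ini directive that the HTTP stream wrapper honours; the URL
and values below are placeholders:

<?php
// Cap how long PHP's stream wrappers (and thus file_get_contents())
// will wait on the socket, then check the return value.
ini_set('default_socket_timeout', 10); // seconds

$data = file_get_contents('http://example.com/data.xml'); // placeholder URL

if ($data === false) {
    // Remote server unreachable or timed out; handle it gracefully.
    error_log('Remote fetch failed');
} else {
    // ... process $data ...
}
?>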

byebye


2009/4/7 Skip Evans s...@bigskypenguin.com:
 Hey all,

 I'm doing some maintenance work on an existing system and there is a piece
 of code that uses file_get_contents() to read data from a URL, which is fine
 in theory I suppose.

 But the problem is sometimes the server where that URL lives is not
 available, and the system hangs indefinitely.

 Shouldn't this be done with curl, and if so can it be done so that the call
 will time out and return control back when the server is not available?

 Any other recommendations?

 I just came across this code and it's one of the client's biggest
 complaints.







Re: [PHP] file_get_contents for URLs?

2009-04-07 Thread Richard Heyes
 Hey all,

Hello.

 I'm doing some maintenance work on an existing system and there is a piece
 of code that uses file_get_contents() to read data from a URL, which is fine
 in theory I suppose.

 But the problem is sometimes the server where that URL lives is not
 available, and the system hangs indefinitely.

 Shouldn't this be done with curl, and if so can it be done so that the call
 will time out and return control back when the server is not available?

Looking at the docs alone, it looks like you can pass a stream context
as the third argument to file_get_contents(). So create a context, set
the timeout on it (using stream_context_create() and/or
stream_context_set_option()), and then pass it to
file_get_contents().
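
Something like this rough sketch, assuming an HTTP URL (the URL and
timeout value are placeholders):

<?php
// Build a stream context with an HTTP read timeout and hand it to
// file_get_contents() as the third argument.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 10, // seconds before the read gives up
    ),
));

$data = file_get_contents('http://example.com/data.xml', false, $context);

if ($data === false) {
    // Timed out or unreachable.
    error_log('Remote fetch failed');
}
?>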

-- 
Richard Heyes

HTML5 Canvas graphing for Firefox, Chrome, Opera and Safari:
http://www.rgraph.net (Updated March 28th)
