ID:               35005
User updated by:  daniel at polkabrothers dot com
Reported By:      daniel at polkabrothers dot com
Status:           Open
Bug Type:         Network related
Operating System: Mac OS X 10.4.2
PHP Version:      5.0.5

New Comment:
I should probably add that I've tried running this both as root (su root; ulimit -n 5000) and using sudo (sudo php ...). Same result.


Previous Comments:
------------------------------------------------------------------------

[2005-10-27 23:06:30] daniel at polkabrothers dot com

I used ulimit -n to increase the number of allowed open files; otherwise it wouldn't even allow me to create 3000 files. Now ulimit -a gives me:

core file size        (blocks, -c) 0
data seg size         (kbytes, -d) 6144
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) unlimited
max memory size       (kbytes, -m) unlimited
open files                    (-n) 10240
pipe size          (512 bytes, -p) 1
stack size            (kbytes, -s) 8192
cpu time             (seconds, -t) unlimited
max user processes            (-u) 100
virtual memory        (kbytes, -v) unlimited

Can't find anything else which relates to file descriptors and Mac OS X.

------------------------------------------------------------------------

[2005-10-27 22:56:55] [EMAIL PROTECTED]

Looks like Mac OS X has the maximum number of file descriptors set to 1024 or something like that. I don't have Mac OS X around here, but I guess this fact should be documented somewhere @ apple.com. Could you check it?

------------------------------------------------------------------------

[2005-10-27 22:50:34] daniel at polkabrothers dot com

Have now done a bit more testing, and it only happens if you try to open more than 1017 files and then try to open a URL. Have tried opening URLs with fopen(), the curl_* functions and exec("wget"). Same end result: they don't connect.

PHP doesn't generate any error messages when opening with fopen(). With the curl functions, curl returns "couldn't connect", but if you turn on more debugging it comes back with "Unknown error: 0". When exec()'ing wget, it stops as soon as it gets a connection and is about to output "200 OK".

(I have read the how-to-report-bugs instructions, but can't find what I'm missing to include.)

------------------------------------------------------------------------

[2005-10-27 22:42:02] [EMAIL PROTECTED]

Not enough information was provided for us to be able to handle this bug. Please re-read the instructions at http://bugs.php.net/how-to-report.php

If you can provide more information, feel free to add it to this bug and change the status back to "Open". Thank you for your interest in PHP.

------------------------------------------------------------------------

[2005-10-27 22:31:53] daniel at polkabrothers dot com

Description:
------------
When opening a lot of files (3000 in this case) under Mac OS X, network connectivity disappears. The same code has been tested under Linux 2.6 and works fine.

Reproduce code:
---------------
$fp = array();
// Open 3000 local files to use up file descriptors.
for ($x = 0; $x < 3000; $x++) {
    $fp[$x] = fopen("/tmp/$x", "w");
}
// Then try to open a network stream.
$url_fp = fopen("http://www.google.com", "r");
var_dump(fread($url_fp, 1500));

Expected result:
----------------
To get the first 1500 bytes from www.google.com.

Actual result:
--------------
string(0) ""

------------------------------------------------------------------------
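A minimal sketch that could help narrow down the threshold mentioned in the comments above: it opens files one at a time and probes a plain TCP connection with fsockopen() every 50 handles, so the failure point shows up without going through the http:// stream wrapper. The host www.example.com, port 80, the /tmp/fdtest_* paths and the 50-handle step are placeholder assumptions, not part of the original report.

<?php
// Sketch: find roughly how many open file handles it takes before
// outbound TCP connections start failing on this machine.
$fp = array();
for ($x = 0; $x < 3000; $x++) {
    $fp[$x] = fopen("/tmp/fdtest_$x", "w");
    if ($fp[$x] === false) {
        echo "fopen() itself failed at handle $x\n";
        break;
    }

    // Probe the network every 50 handles to keep the run short.
    if ($x % 50 === 0) {
        $sock = @fsockopen("www.example.com", 80, $errno, $errstr, 5);
        if ($sock === false) {
            echo "TCP connect fails with about $x files open: $errstr ($errno)\n";
            break;
        }
        fclose($sock);
    }
}

// Clean up the temporary handles.
foreach ($fp as $handle) {
    if ($handle !== false) {
        fclose($handle);
    }
}

If the probe starts failing right around the 1017 mark reported above, that matches the roughly-1024 descriptor guess in the comments rather than the 10240 open-files limit shown by ulimit -a.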