Edit report at https://bugs.php.net/bug.php?id=62409&edit=1

 ID:                 62409
 Updated by:         pierr...@php.net
 Reported by:        emmet at trovit dot com
 Summary:            Using cURL multi to save urls to a file then reading
                     files - files chopped
-Status:             Open
+Status:             Not a bug
 Type:               Bug
 Package:            cURL related
 Operating System:   Mac OS X 10.6.8
 PHP Version:        5.3.14
 Block user comment: N
 Private report:     N

 New Comment:

Thank you for taking the time to write to us, but this is not
a bug. Please double-check the documentation available at
http://www.php.net/manual/ and the instructions on how to report
a bug at http://bugs.php.net/how-to-report.php

This is not a bug. As you noticed, you need to fclose() the file handle
before trying to read the contents of the file.
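A minimal sketch of the corrected pattern (the URL and path below are placeholders, not taken from the report):

```php
<?php
// Download one URL to a file via curl multi, then close the file
// handle BEFORE reading the file back.
$url  = 'http://example.com/';   // placeholder URL
$path = '/tmp/download.html';    // placeholder path

$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the response body to $fp

$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);

$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
curl_close($ch);

// Without this fclose(), data still buffered in the stream layer may
// not be visible yet, and file_get_contents() can return a truncated
// result (as reported above).
fclose($fp);

echo file_get_contents($path);
```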


Previous Comments:
------------------------------------------------------------------------
[2012-06-25 16:43:29] emmet at trovit dot com

It turns out that everything works as expected when you close the original 
file handle just before calling file_get_contents() on the same file. So this 
is not a bug, more of an 'undocumented misbehaviour'.

I've pasted the code which works here:
http://pastebin.com/kzVMAjdJ

------------------------------------------------------------------------
[2012-06-25 15:41:11] emmet at trovit dot com

Description:
------------
I'm using curl multi to download files in parallel. In the test script I 
specify that I want to put the contents of the download in a file using the 
CURLOPT_FILE option. The file is downloaded correctly, but when I try 
file_get_contents() or fopen() on the downloaded file, the contents are always 
chopped off. If I put the contents of the file in a variable and count the 
length of the string, it is always 40960 characters at most. The file itself 
downloads entirely to disk; it's just PHP that won't read it all and limits 
it to 40 KB.
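The maintainer's resolution is to fclose() the handle before reading. If the handle has to stay open for some reason, one alternative worth trying (my suggestion, not from this report) is to flush PHP's stream write buffer explicitly:

```php
<?php
// Illustrative only: flush buffered writes to disk without closing
// the handle, so a subsequent read sees the full contents.
$path = '/tmp/download.html';    // placeholder path
$fp = fopen($path, 'w');
fwrite($fp, str_repeat('x', 100000));
fflush($fp);                     // push PHP's buffered bytes out to the OS
$len = strlen(file_get_contents($path));
fclose($fp);
```

Whether fflush() alone is sufficient in the curl multi case is untested here; fclose() remains the confirmed fix.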

Tested with PHP 5.3.8, 5.3.13 and 5.3.14, the same thing happens with all 
versions.

Output of php -v:
PHP 5.3.8 (cli) (built: Dec  5 2011 21:24:09) 
Copyright (c) 1997-2011 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2011 Zend Technologies
    with Xdebug v2.1.2, Copyright (c) 2002-2011, by Derick Rethans

PHP 5.3.13 (cli) (built: May  9 2012 07:21:29) 
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies
    with Xdebug v2.1.3, Copyright (c) 2002-2012, by Derick Rethans

PHP 5.3.14 (cli) (built: Jun 25 2012 16:47:44) 
Copyright (c) 1997-2012 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2012 Zend Technologies

This bug is not quite the same as https://bugs.php.net/bug.php?id=52558, 
but they could be related.

Test script:
---------------
I've put the code on pastebin, I can upload it again or email it if needed:
http://pastebin.com/mMHVK85Z

Expected result:
----------------
The script should echo out the entire contents of the newly downloaded webpage 
(which has been put in a local file).

Actual result:
--------------
Only the first 40960 characters are output. Checking the file manually with 
tail -f shows that the file was downloaded entirely, but PHP won't display 
all of it.


------------------------------------------------------------------------


