Hi all. Using ownCloud 3.0, I ran into problems when trying to share large files globally. The Web server just returned a 500; the Apache log revealed the underlying problem:

[Tue Mar 13 14:29:00 2012] [error] [client ...] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 381095937 bytes) in /var/www/owncloud-3.0.0/lib/filestorage/local.php on line 62

It turns out that this is a common problem with PHP's readfile() function, and the best way to handle it is to read the file in chunks. So I patched the local.php file, see attached. This approach works fine for me with both small files (< $chunksize) and large files (380 MB). Maybe you might consider merging this into the master branch.

A quick grep through the source suggests that the exact same problem exists in lib/filestorage/remote.php, but I don't know how and when the OC_Filestorage_Remote class is used, or whether the remote file size could ever exceed the memory limit.

-Christian
--- lib/filestorage/local.php.orig 2012-03-13 14:38:18.375771275 +0100
+++ lib/filestorage/local.php 2012-03-13 15:02:46.740763263 +0100
@@ -59,7 +59,23 @@
return file_exists($this->datadir.$path);
}
public function readfile($path){
- return readfile($this->datadir.$path);
+ // How many bytes to read at once
+ $chunksize = 1024 * 1024;
+ $size = filesize($this->datadir.$path);
+ $size_read = 0;
+
+ // Read the file contents chunk by chunk
+ $f = fopen($this->datadir.$path, 'rb');
+ while (!feof($f) && ($buf = fread($f, $chunksize)) !== FALSE) {
+ echo $buf;
+ ob_flush();
+ flush();
+ $size_read += strlen($buf); // count the bytes actually read, not the chunk size
+ }
+ fclose($f);
+
+ // If no error occurred, we have read the whole file
+ return ($buf !== FALSE) ? $size : $size_read;
}
public function filectime($path){
return filectime($this->datadir.$path);
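For reference, here is the same chunked-read logic as a standalone sketch that can be tested from the CLI. readfile_chunked() is just an illustrative name, not part of the ownCloud code, and the ob_flush()/flush() calls from the patch are left out here (in a web context you would flush after each chunk so the output buffer never grows beyond $chunksize):

```php
<?php
// Minimal sketch of the chunked-read approach from the patch above.
// Echoes the file in $chunksize pieces and returns the number of bytes
// emitted, or false on a read error.
function readfile_chunked($path, $chunksize = 1048576) {
    $f = fopen($path, 'rb');
    if ($f === false) {
        return false;
    }
    $bytes = 0;
    $buf = '';
    while (!feof($f) && ($buf = fread($f, $chunksize)) !== false) {
        echo $buf;
        // Count what fread() actually returned, so the total stays
        // correct for the final, shorter chunk.
        $bytes += strlen($buf);
    }
    fclose($f);
    return ($buf === false) ? false : $bytes;
}
```

Peak memory stays around $chunksize regardless of the file size, which is exactly what avoids the "Allowed memory size exhausted" fatal error from readfile() on large files.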
_______________________________________________
Owncloud mailing list
[email protected]
https://mail.kde.org/mailman/listinfo/owncloud
