Yes, please provide more info on your setup.

I will make one comment, though, because I hit this same issue with Solaris 8,
Apache 1.3.29, PHP 4.3.2, and SM 1.4.2: if I enable memory_limit at compile
time for PHP, it consistently causes problems with large attachments. I would
see log entries claiming the script had tried to allocate more than 12MB of
memory when it only wanted a few hundred K. In testing I even raised the limit
to 512MB, and it still failed.

Thinking this was related to php_accelerator (www.php-accelerator.co.uk), I set
its memory workaround flag, to no avail. In fact, disabling php_accelerator
altogether STILL did not fix the problem. The only way I was able to get this
to work consistently was to build PHP *without* memory limits enabled.
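For reference, in PHP 4.x the memory_limit directive only takes effect when PHP
is compiled with the --enable-memory-limit configure flag, so "without memory
limits enabled" just means leaving that flag out. A rough sketch of such a
build follows; the apxs and config-file paths are assumptions, so adjust them
for your own install:

```shell
# Rebuild PHP 4.3.x without compiled-in memory limit support:
# --enable-memory-limit is simply omitted from the configure line.
# Paths below are examples only -- adjust for your installation.
./configure \
  --with-apxs=/usr/local/apache/bin/apxs \
  --with-config-file-path=/usr/local/lib
make
make install
```

With memory limit support compiled out, the memory_limit line in php.ini is
ignored entirely.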

Yes, it frustrates me too, but you still have max_execution_time to cap how
long a script is allowed to run.
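For what it's worth, the runtime cap I mean is the standard max_execution_time
directive in php.ini; the value below is just an illustration, not a
recommendation:

```ini
; php.ini -- still enforced even when memory limits are compiled out
max_execution_time = 30   ; maximum script runtime, in seconds
```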

-- 
Chris Winterrowd
Unix E-mail Administrator
Texas Instruments Inc.
[EMAIL PROTECTED]

> I've searched the archives and double checked my php.ini settings. I have
>
> memory_limit = 12M
> file_uploads = On
> upload_max_filesize = 6M
> post_max_size = 8M
>
> and yes, I restarted httpd many times.
>
> Still, when I try to add an attachment that is over a couple hundred K, I
> get "File contains no data" error message. This pops up with only a second
> or two delay. It doesn't even try to get the file. Anything smaller works
> just fine.
>
> - Gary
>


--
squirrelmail-users mailing list
List Address: [EMAIL PROTECTED]
List Archives:  http://sourceforge.net/mailarchive/forum.php?forum_id=2995
List Info: https://lists.sourceforge.net/lists/listinfo/squirrelmail-users
