ecc wrote:
> I really don't know if this is the right board for this... but I think I
> have to write it down.

if you're looking to post a feature request then there is no list for that
- the bug database would be the place to do it ... but please don't bother the
devs with this feature request - it won't happen, for reasons that are public
record.

> 
> Please throw an error if file_get_contents can't read a file because of a
> memory_limit overflow!

go read the archives of [EMAIL PROTECTED] for numerous reasons why that
is not going to happen.

> 
> I've programmed a tool that parses checksums from large files. Because I have
> to manage header offsets, I have to read the files using file_get_contents.

given that your own comment in the example below states "while loop reading
the data! (This works!)", I can't see why you *have* to use file_get_contents().

have you tried filesize() and checked that the result is less than memory_limit?
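
fwiw, a minimal sketch of that check. ini_shorthand_to_bytes() is a name I
made up on the spot; the conversion just follows the php.ini shorthand syntax
(K/M/G suffixes) that memory_limit uses:

```php
<?php
// sketch only - ini_shorthand_to_bytes() is a made-up helper name; the
// conversion follows the php.ini shorthand syntax (K/M/G suffixes).
function ini_shorthand_to_bytes($value)
{
    $value = trim($value);
    $bytes = (int) $value;
    switch (strtoupper(substr($value, -1))) {
        case 'G': $bytes *= 1024; // fall through
        case 'M': $bytes *= 1024; // fall through
        case 'K': $bytes *= 1024;
    }
    return $bytes;
}

$file  = __FILE__;   // stand-in path; use your own file here
$limit = ini_shorthand_to_bytes(ini_get('memory_limit'));

// -1 means unlimited; otherwise leave yourself headroom, because the rest
// of the script uses memory too, not just file_get_contents()
if ($limit < 0 || filesize($file) < $limit / 2) {
    $data = file_get_contents($file);   // safe to slurp in one go
} else {
    // fall back to a chunked fread() loop
}
```

the halving is arbitrary - the point is simply to decide *before* the call
whether slurping the whole file can possibly fit.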

> (This should be the fastest way for this task, says the manual... right!)
> 
> The problem is that if the memory_limit in php.ini is set to 64MB, and
> the file read in by file_get_contents is bigger than 64MB, file_get_contents
> crashes without an EXCEPTION. This crash can't be caught!

why do you think it should throw an exception? it won't - especially not a
particular exception class that you made up on the spot.

php errors and exceptions are 2 different things; before going any
further you should read up on both so that you understand the difference.
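
to illustrate the difference: *warnings* can be promoted to exceptions with
set_error_handler() and the built-in ErrorException class (real API, PHP 5.1+;
the closure syntax below needs 5.3, use a named function on older versions).
but hitting memory_limit is a *fatal* error - it aborts the engine and never
reaches any error handler, so there is nothing to catch:

```php
<?php
// warnings (e.g. file_get_contents() on a missing file) can be turned into
// ErrorException and caught; fatal errors like a memory_limit overrun cannot.
set_error_handler(function ($severity, $message, $file, $line) {
    throw new ErrorException($message, 0, $severity, $file, $line);
});

$caught = false;
try {
    $data = file_get_contents('/no/such/file');   // emits E_WARNING
} catch (ErrorException $e) {
    $caught = true;                               // warning caught as exception
}
restore_error_handler();
// a memory_limit overflow would have killed the script before this line
```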

> 
> I want to handle it this way:
> 
> try{
>     $data = file_get_contents($file);
> }
> catch (MemoryLimitReachedException $e){
>     // try the same using a while loop reading the data! (This works!)
> }
> 
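
for the record, the "while loop" you already have is the right tool for the
whole job, not just the fallback: fseek() past the header, then feed fixed-size
chunks into an incremental hash context, which keeps memory use constant no
matter how big the file is. a sketch - the function name and the crc32b choice
are mine, but hash_init()/hash_update()/hash_final() are the real incremental
hashing API (PHP 5.1.2+):

```php
<?php
// sketch: checksum a file from $headerOffset onwards in 8 KB chunks.
// checksum_after_header() is a made-up name; adjust the algorithm to taste.
function checksum_after_header($file, $headerOffset, $algo = 'crc32b')
{
    $fp = fopen($file, 'rb');
    if ($fp === false) {
        return false;
    }
    fseek($fp, $headerOffset);      // skip the header, keep your offset logic

    $ctx = hash_init($algo);        // constant memory, any file size
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        if ($chunk === false || $chunk === '') {
            break;
        }
        hash_update($ctx, $chunk);
    }
    fclose($fp);
    return hash_final($ctx);
}
```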
> Hope there is a solution for this.
> 
> Thank you.
> Andreas
> http://www.php-gtk.eu/apps/emucontrolcenter
> 

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
