This is really strange: I'm getting BufferUnderflow errors for our 300 KB stream files, but the 768 KB stream files are fine. Some don't stream at all, they just hang. This happens both with and without a cache.

Mondain wrote:
> Basically it will try to first allocate the amount of space needed to read the entire file. This is tunable as far as using heap buffers or direct buffers go. Please note that this "fix" is not currently in RC1, though, as I changed the handling after the release. The code in 0.6 prior to my fix attempted to "wrap" the bytes from the supplied file input stream without letting the implementer select heap or direct, nor did that code first attempt to allocate space for the file being read. What is the requested strategy for these ginormous files? With files over two gigabytes we are going to have issues in Java...
_______________________________________________ Red5 mailing list [email protected] http://osflash.org/mailman/listinfo/red5_osflash.org
