To follow up on my previous post: I made tiled versions of the images I
was using with maketx.
It does not crash anymore.
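For the record, the conversion was along these lines (a sketch: the tile size and filenames here are placeholders, not the exact command I ran):

```shell
# Convert a scanline TIFF into a tiled .tx texture.
# --tile sets the tile dimensions (64x64 is just an example).
maketx --tile 64 64 map_diffuse.tif -o map_diffuse.tx
```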
I am now convinced there is an issue with the cache release function.
Why use an assert rather than simply exiting the loop?
On 10/24/2012 05:24 PM, Michel Lerenard wrote:
Hi everyone,
I have a problem with an assert firing in
ImageCacheImpl::check_max_mem: on line 1952, ASSERT (full_loops < 100)
fails.
I get this error on a scene with 4 meshes, each having 3 TIFF maps
applied through different material channels. Each mesh has its own
maps, so I have 12 maps in total (file sizes vary between 16 and
70 MB; the maps are 8k×8k).
Depending on how much cache I set, I get the error sooner or later.
If I set more than 1.5 GB, which seems to be enough to load everything,
there is no problem. I tried raising the loop limit to 1000, thinking
there might really be a lot of tiles to free, but it crashes in exactly
the same way.
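For reference, the cache limit I mention is set roughly like this (a sketch assuming the standard OIIO TextureSystem API; 1500 MB is just the value that happens to avoid the assert in my scene):

```cpp
#include <OpenImageIO/texture.h>

using namespace OIIO;

// One shared TextureSystem serves all my evaluation threads.
TextureSystem *texsys = TextureSystem::create (true /*shared*/);

// Cache budget in megabytes for the underlying ImageCache.
texsys->attribute ("max_memory_MB", 1500.0f);
```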
I run several evaluations at the same time (up to 24) through a
single texture system.
As a test I restricted the number of evaluations to 1, and it worked.
It seems to me that with several evaluations running, we try to free
memory but cannot, because everything is in use. So I uncommented the
log line that displays the amount of memory freed, and I was surprised
by the numbers:
Freeing tile, recovering 67108864
Freeing tile, recovering 67108864
Freeing tile, recovering 67108864
Freeing tile, recovering 201326592
Does this mean the file has no tiles? Or a single tile? In those
cases, shouldn't the image bypass the cache?
Am I using the library correctly or not? I'm not sure...
Michel
_______________________________________________
Oiio-dev mailing list
[email protected]
http://lists.openimageio.org/listinfo.cgi/oiio-dev-openimageio.org