My suspicion is that this is related to the growing number of files held open 
by the underlying ImageCache, which are not immediately closed or freed just 
because you clear() or even destroy the ImageBuf. Or possibly it's the 
overhead of what happens once the maximum number of open files is reached.
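
As a quick sanity check on how much headroom the OS gives you, you can peek 
at the process file-descriptor limit (plain Python stdlib, nothing 
OIIO-specific; Linux/OSX only):

    import resource

    # (soft, hard) limits on open file descriptors for this process;
    # the soft limit is often only 1024 on Linux
    soft, hard = resource.getrlimit (resource.RLIMIT_NOFILE)
    print ("fd limit: soft=%d hard=%d" % (soft, hard))

As long as max_open_files stays well under the soft limit, raising it is safe.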

So, first, let's increase the number of files that can be held open:

    import OpenImageIO as oiio

    # Do this before opening any files...
    cache = oiio.ImageCache.create (shared=True)
    cache.attribute ("max_open_files", 500)

The default is 100; I'm curious what happens if you just raise that. On 
Linux/OSX it should be safe to go into the thousands, but Windows usually has 
a lower limit, so let's try 500 just to see what happens to your timings. In 
particular, does the slowdown seem to come later, i.e. after more files have 
been touched?
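
If it helps to quantify that, here's a rough timing sketch (hypothetical 
scaffolding on my part; "files" stands in for your list of filenames, and 
get_resolution_from_attr is your function quoted below) that prints 
throughput every 1000 files, so you can see where the knee is:

    import time

    t0 = time.time()
    for i, f in enumerate (files):
        get_resolution_from_attr (f)
        if (i + 1) % 1000 == 0:
            t1 = time.time()
            print ("%6d files: %.1f images/sec" % (i + 1, 1000.0 / (t1 - t0)))
            t0 = t1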

Next, what I think you want is to let the cache know that it's safe to close 
the files when you won't need them for a while.  Keep that cache reference 
(above) around, and try this modification:

    def get_resolution_from_attr(file):
        """Getting the resolution from an attribute. Yes there are better
        ways to get resolution :)"""
        buf = oiio.ImageBuf(file)
        spec = buf.spec()
        xres = spec.get_attribute('XResolution')
        yres = spec.get_attribute('YResolution')
        cache.invalidate (file)    #  <---- new thing here
        return file, xres, yres
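
(One caveat: invalidate() also discards anything already cached for that 
file, so if you touch the same file again later, it will be reopened and 
reread. For a single linear pass over the files, that's exactly what you 
want.)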

"Invalidation" isn't quite what we're after, but it'll do for now. What we 
really want is to be able to tell the cache to "close" the file, keep anything 
currently in cache for it valid, but just close the file and free resources 
associated with holding it open. But that API call doesn't exist (but I will 
add it, if this experiment shows an improvement for you).



> On Apr 11, 2016, at 4:15 PM, Larry Gritz <[email protected]> wrote:
> 
> Are you going through the images sequentially? Does your app have any 
> knowledge about when it will not need a file for a while? Or is it truly 
> unpredictable random access?
> 
> Can you explain a little bit more about the access pattern? Are you often 
> just checking attributes, or are you accessing pixels?
> 
> Also, which platform?
> 
> 
>> On Apr 11, 2016, at 4:05 PM, Jonathan Tilden (2K) <[email protected]> 
>> wrote:
>> 
>> Hi all!
>> 
>> We're running into an issue where we are seeing a great decrease in 
>> performance when opening 1000s (23,317, specifically) of files with image 
>> buffers in Python. 
>> Initially, each image is very fast (many per second), but as time goes on, 
>> the perf drops to < 1 image/sec.
>> 
>> Our problem is that we were looking at DDS files to pull out the 
>> compression (using the "compression" attribute on the image spec). 
>> Suspecting it might be an issue with DDS I/O, we switched to more generic 
>> images, with a more generic attribute query:
>> 
>> import OpenImageIO as oiio
>> 
>> def get_resolution_from_attr(file):
>>     """Getting the resolution from an attribute. Yes there are better
>>     ways to get resolution :)"""
>>     buf = oiio.ImageBuf(file)
>>     spec = buf.spec()
>>     xres = spec.get_attribute('XResolution')
>>     yres = spec.get_attribute('YResolution')
>>     buf.clear() # Didn't seem to make much of a difference
>>     return file, xres, yres
>> 
>> Before we get all deep with the debugger - is there anything obvious that 
>> we're doing here that might cause the gradual slowdown of the image buffers? 
>> Not using image buffers isn't much of an option for us as we need to do some 
>> image-buffery stuff with them later. 
>> 
>> Thanks!
>> -J
> 

--
Larry Gritz
[email protected]

