* Nathan Carl Summers <[EMAIL PROTECTED]> [030813 15:39]:
> > dunno why an editor should do progressive load. Load smaller res in
> > case of problem? I would try to avoid that instead of try to fix it,
> > with proper storage and transmission. Load with proxy images? Too
> > rough, IMO, it is not a scaled down version.
> Well, working on a scaled-down version of large files is an important
> optimization.  It's true that not all image manipulation functions can
> credibly be approximated with working on a scaled-down version, but that's
> for the gegl people to worry about.
> My guess is that it will be easier to use interlaced data than true
> scaled-down images, and the savings in terms of computational time and
> pipeline flexibility will be worth it.

Ideally GEGL will collapse all affine transformations, thus doing the
resampling only once; that resample should ideally be done from the
original data, with the result possibly stored in a tile cache for
subsequent evaluations of the compositing graph. If one wants the
ability to use a scaled-down version of the image directly from file,
one should rather use a specialized image format storing an image
pyramid (sizes 50%, 25%, 12.5%, 6.25%, etc.) that allows the GEGL
faucet node providing the image to take the scale factor as a
parameter when loading.
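To illustrate the collapsing of affine transformations, here is a
minimal sketch (not GEGL's actual API) of composing two 2x3 affine
matrices so that a chain of transforms reduces to a single matrix,
and hence a single resample:

```python
def mat_mul(a, b):
    """Compose two 2x3 affine matrices, stored row-major as
    [m00, m01, tx, m10, m11, ty]; the result applies b first, then a."""
    return [
        a[0] * b[0] + a[1] * b[3],
        a[0] * b[1] + a[1] * b[4],
        a[0] * b[2] + a[1] * b[5] + a[2],
        a[3] * b[0] + a[4] * b[3],
        a[3] * b[1] + a[4] * b[4],
        a[3] * b[2] + a[4] * b[5] + a[5],
    ]

def apply_affine(m, x, y):
    """Transform the point (x, y) by the 2x3 affine matrix m."""
    return (m[0] * x + m[1] * y + m[2], m[3] * x + m[4] * y + m[5])

# A scale by 2 followed by a translation by (5, 7) collapses into one
# matrix; sampling through the combined matrix resamples only once.
scale = [2, 0, 0, 0, 2, 0]
translate = [1, 0, 5, 0, 1, 7]
combined = mat_mul(translate, scale)
```

Applying `combined` to a point gives the same result as applying the
two transforms in sequence, e.g. `apply_affine(combined, 3, 4)` yields
`(11, 15)`, just like scaling (3, 4) to (6, 8) and then translating.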
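A rough sketch of the pyramid idea, assuming halving at each level and
picking the smallest level that still meets the requested scale factor
(the function names are illustrative, not from GEGL):

```python
import math

def pyramid_levels(width, height, min_size=1):
    """List the (width, height) of each pyramid level: 100%, 50%, 25%, ..."""
    levels = [(width, height)]
    while max(levels[-1]) > min_size:
        w, h = levels[-1]
        levels.append((max(1, w // 2), max(1, h // 2)))
    return levels

def level_for_scale(scale, num_levels):
    """Pick the smallest pyramid level with enough resolution for a
    requested scale factor (1.0 = full size); level n holds the image
    at 1/2**n of the original size."""
    if scale >= 1.0:
        return 0
    level = int(math.floor(math.log2(1.0 / scale)))
    return min(level, num_levels - 1)
```

For example, a request for 30% of full size would be served from the
50% level (level 1), while a request for exactly 25% can use level 2
directly, so the loader never has to touch the full-resolution data.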

For general operation this will not be of great importance, and thus a
format providing a linear buffer would be better. For 8- and 16-bit
integer data, uncompressed PNG would provide random access if stored
within a container file format. A compressed PNG file would be a little
harder, but with an intelligent PNG loader one could fetch a row of
tiles without much overhead (decompressing the preceding tiles without
actually storing the data).
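The skip-without-storing trick can be sketched with a streaming
inflater. PNG image data (IDAT) is zlib-compressed, so the example
below uses Python's zlib as a stand-in, ignoring PNG's per-row filter
byte for simplicity; only the requested rows are kept, while the
preceding rows are inflated and immediately discarded:

```python
import zlib

def read_row_range(compressed, row_bytes, first_row, num_rows):
    """Stream-decompress a zlib buffer, keeping only rows
    [first_row, first_row + num_rows); earlier rows are inflated
    but thrown away rather than stored."""
    d = zlib.decompressobj()
    skip = first_row * row_bytes   # bytes to discard before the target
    want = num_rows * row_bytes    # bytes to keep
    kept = bytearray()
    for i in range(0, len(compressed), 4096):
        chunk = d.decompress(compressed[i:i + 4096])
        if skip >= len(chunk):
            skip -= len(chunk)     # still inside the discarded prefix
            continue
        chunk = chunk[skip:]
        skip = 0
        kept += chunk[:want - len(kept)]
        if len(kept) == want:
            break
    if len(kept) < want:
        tail = d.flush()[skip:]
        kept += tail[:want - len(kept)]
    return bytes(kept)
```

Memory use stays bounded by one decompressed chunk plus the rows
actually requested, which is the point: random-ish access into a
compressed stream without buffering everything before it.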


  /V\    Øyvind Kolås,  Gjøvik University College, Norway 
  ^ ^    
Gimp-developer mailing list