Though I have never worked with images that large, wouldn't it be a good
idea to save each TIFF as XCF, do your editing on the XCF, and then export
the modified XCF back to TIFF? The odd behavior may go away that way,
since XCF is GIMP's native file format.
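If it helps, that round trip can be scripted with GIMP's batch mode so you don't have to open each 100MB file in the GUI. This is only a sketch: the filenames are placeholders, and it assumes GIMP 2.2's Script-Fu, where the non-interactive run mode is the constant 1.

```scheme
; Convert in.tif to GIMP's native XCF format without the GUI.
; Invoke as: gimp -i -b '<this expression>' -b '(gimp-quit 0)'
(let* ((image    (car (gimp-file-load 1 "in.tif" "in.tif")))
       (drawable (car (gimp-image-get-active-drawable image))))
  ; 1 = non-interactive run mode in GIMP 2.2 Script-Fu
  (gimp-file-save 1 image drawable "out.xcf" "out.xcf")
  (gimp-image-delete image))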
Maybe increasing the tile cache will also work better with the XCF!
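For what it's worth, the tile cache can also be set persistently in the gimprc file rather than through the preferences dialog. A sketch, assuming GIMP 2.2's config location and syntax; the 400M value is just the size that reportedly worked in the message below:

```
# In ~/.gimp-2.2/gimprc -- tile cache size, with a K/M/G unit suffix
(tile-cache-size 400M)
```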
On 4/24/05, [EMAIL PROTECTED]
<[EMAIL PROTECTED]> wrote:
> Date: Sat, 23 Apr 2005 20:03:07 -0600
> From: jim feldman <[EMAIL PROTECTED]>
> To: firstname.lastname@example.org
> Subject: [Gimp-user] Odd behavior with big images and memory
> I'm working with scanned medium-format film images that are TIFFs of 100MB
> each. The GIMP environment is GIMP 2.2.6 (built from ports about a week ago)
> on FreeBSD 5.3-RELEASE. The display is a Linux (RH9) box. The TIFFs are
> created by VueScan on Linux.
> The FreeBSD box was running with "only" 512MB memory. GIMP and the
> OS paged so much that the disk light went solid red for 2 minutes every time I
> opened the image. I doubled the system memory, and figured I should set GIMP's
> tile cache up to 600MB. I load the first image, and GIMP tells me the image is
> 6228x5117, true color, and 247MB in memory. I then tried Filters >
> Decompose > RGB (so I could play with B&W) and GIMP died. I've attached a
> log from a run that included stack trace mode and debug handlers. We died in
> gmem.c trying to allocate 8192 bytes. If I set the tile cache back down to
> 400MB, however, everything works fine. 500MB also caused it to crash. If I
> don't instrument it, I get a "script-fu:29966: LibGimpBase-WARNING **:
> script-fu: wire_read(): error" before it exits.
> Bugzilla time?
Gimp-user mailing list