After some more thinking, here's my reasoning why 'Save' is indeed the problem.
This becomes clear in comparison with a database-driven, version-controlled
workflow as e.g. sketched in the last mockup of . A Save command is superfluous
in that scenario. When an image gets edited, it's reasonable to assume the user
wants to keep the changes. Otherwise she can undo, revert (= bulk undo) or
simply delete the image. The only IO commands required here are Import and
Export, to exchange images between the world and the database.
Easy interface _and_ technically clean, i think.
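To make the model above concrete, here is a minimal sketch of such a
database-driven store in Python. All names (ImageDatabase, Composition,
import_, export) are hypothetical illustrations, not an actual GIMP API:
every edit is recorded automatically, so the only IO commands are Import
and Export, with undo/revert/delete covering the "i didn't want that" cases.

```python
# Hypothetical sketch: a version-controlled image store with no Save command.
# Edits are persisted as they happen; Import/Export are the only IO commands.

class Composition:
    def __init__(self, name):
        self.name = name
        self.history = []                # every edit is recorded immediately

    def edit(self, operation):
        self.history.append(operation)   # no explicit Save step needed

    def undo(self):
        if self.history:
            self.history.pop()           # step back one edit

    def revert(self):
        self.history.clear()             # bulk undo back to the imported state


class ImageDatabase:
    def __init__(self):
        self.compositions = {}

    def import_(self, path):
        # Bring an image from "the world" into the database.
        comp = Composition(path)
        self.compositions[path] = comp
        return comp

    def export(self, comp, path):
        # Render the current state of a composition back out to a file.
        return f"rendered {comp.name} ({len(comp.history)} edits) to {path}"

    def delete(self, comp):
        # The escape hatch: throw the composition away entirely.
        del self.compositions[comp.name]
```

The point of the sketch is what is *absent*: there is no save() anywhere,
because the database is always current.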
Now current GIMP already employs sort of a temporary database of
images: that is the working set of currently opened images.
Prior to editing, images are opened (=imported) into the working set.
The equivalent of Export is Save-a-Copy.
The major difference from a user's point of view is that the composition
explicitly has to be saved to disk, otherwise it gets lost.
So obviously the problem must be rooted in having to Save manually,
which is a legacy concept from the era of floppy disks.
The current spec builds a clean model on top of that legacy Save concept.
And the result is not as easy as we all would like it to be. That a lot of
effort is required to communicate the model in the UI is probably a sign of this.
In the mid-term i see GIMP going the database-driven path anyway, but for now
we clearly have to support the classic Open/Edit/Save cycle. For that scenario,
it seems we can't have both easy and clean. So i think it's worthwhile
to reconsider easy-but-dirty models.
Notes sorted from on-topic to off-topic:
- Slightly related: i was quite surprised to find that messing with files also
makes up a good share of the accidental complexity of batch processing.
- The canonical objections are that version control is too expensive for
graphical work and that databases lead to application lock-in.
With GEGL under the hood, the first is not valid anymore. In general,
it is wasteful to store a second XCF instead of a diff of the GEGL tree.
And storing each composition in a self-contained file will face efficiency
problems as well: consider a composition created from 5 JPEGs of a gigapixel
each. Should all the source data be duplicated in the XCF?
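A back-of-the-envelope calculation makes the scale of the duplication
concrete. The numbers below are my own illustrative assumptions (8-bit RGBA,
i.e. 4 bytes per decoded pixel, and roughly a few kilobytes for file
references plus a serialized edit graph), not figures from the spec:

```python
# Rough cost estimate: embedding the decoded pixels of five gigapixel
# source images in a self-contained XCF, versus merely referencing the
# source files from a GEGL-style edit graph.

GPIXEL = 10**9
BYTES_PER_PIXEL = 4                       # assumption: 8-bit RGBA layers

# Self-contained file: all source pixel data duplicated inside the XCF.
embedded_bytes = 5 * GPIXEL * BYTES_PER_PIXEL

# Database/diff approach: five file references plus the edit graph itself.
referenced_bytes = 5 * 200 + 10_000       # assumption: ~10 kB of graph data

print(embedded_bytes // 10**9)            # prints 20, i.e. ~20 GB duplicated
print(referenced_bytes)                   # prints 11000, i.e. ~11 kB
```

Even before compression, the self-contained copy is six orders of magnitude
larger than storing a diff of the edit tree against the original files.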
Application lock-in is a serious concern, though. I think there's consensus
that at desktop level, hierarchical file systems don't serve users well
due to their thousands of multimedia documents. Applications like F-Spot
maintain their own databases, which leads to lock-in or at best duplicated data.
So ideally, these databases have to be provided by the desktop environment,
and that's the point when GIMP will surely jump on that train.
Anyhow, it is debatable whether a private database for GIMP causes
lock-in any worse than the XCF format already does now.
Gimp-developer mailing list