Nathan C Summers <[EMAIL PROTECTED]> writes:
> Problem: Many tools instruct the core to destroy them on certain
> kinds of state changes, such as a change of image or display. While some
> tools are quite good at handling these changes, others are quite unstable
> psychologically and commit hara-kiri for the smallest of reasons.
This problem should go away as soon as all tools are proper objects
with init and finalize functions, destroy signals etc.
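To make that concrete, a "proper object" here means roughly the following (a minimal sketch in plain C; all names are illustrative and not the actual GIMP or GObject API; a real implementation would use the GObject system):

```c
#include <stdlib.h>

/* Illustrative sketch only -- not the real GIMP object system.
 * A "proper" tool object carries init/finalize hooks and a
 * "destroy" signal, so interested parties are notified instead
 * of the core blindly deleting the tool on a state change. */

typedef struct Tool Tool;

typedef struct {
    void (*init)(Tool *tool);       /* set up instance state   */
    void (*finalize)(Tool *tool);   /* release owned resources */
} ToolClass;

typedef void (*DestroyHandler)(Tool *tool, void *user_data);

struct Tool {
    const ToolClass *klass;
    DestroyHandler on_destroy;  /* "destroy" signal, one slot for brevity */
    void *user_data;
    int state;
};

static Tool *tool_new(const ToolClass *klass)
{
    Tool *tool = calloc(1, sizeof *tool);
    tool->klass = klass;
    if (klass->init)
        klass->init(tool);
    return tool;
}

static void tool_destroy(Tool *tool)
{
    if (tool->on_destroy)
        tool->on_destroy(tool, tool->user_data); /* emit "destroy" */
    if (tool->klass->finalize)
        tool->klass->finalize(tool);
    free(tool);
}
```

With a scheme like this, a display or image change can simply emit a signal the tool reacts to, rather than forcing a full teardown.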
> This is inefficient. It complicates the tool-handling code, causes a lot of
> unnecessary frees, mallocs, and initializations, and seems to me to be a lot
> like "Windows must restart for these changes to take effect." On the
> other hand, multiple instances of a tool should exist for different input
> devices so that they can be in different states.
I do not think that occasional object destruction and creation causes
significant overhead. Tools should be created when needed and destroyed
when the last refcount drops. I see no advantage in keeping references to
unused tool objects.
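The lifecycle described above is plain refcounting (a sketch with hypothetical names, not the actual GIMP code): whoever needs the tool takes a reference, and the object is freed exactly when the last reference is dropped.

```c
#include <stdlib.h>

/* Minimal refcounting sketch (illustrative, not the GIMP API):
 * the tool lives exactly as long as someone holds a reference. */

typedef struct {
    int refcount;
    int state;   /* stand-in for per-tool state */
} RefTool;

static RefTool *ref_tool_new(void)
{
    RefTool *tool = calloc(1, sizeof *tool);
    tool->refcount = 1;   /* the creator holds the first reference */
    return tool;
}

static RefTool *ref_tool_ref(RefTool *tool)
{
    tool->refcount++;
    return tool;
}

static void ref_tool_unref(RefTool *tool)
{
    if (--tool->refcount == 0)
        free(tool);       /* last reference dropped: destroy */
}
```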
> Proposed Solution: Tools should just deal with having their state
> changed. We can introduce a function tool_manager_get_tool that takes a
> tool class and input device and returns the correct tool (creating it on
> the fly if needed). The toolbox would just call that function and set the
> current tool on that device to that. active_tool should just go away.
> (so should iterating over the list of registered tools, perhaps)
IMO the GimpToolInfo object Mitch just introduced to resurrect the tool
system is the right way to go. GimpToolInfo objects stay around so Gimp
knows what tools are available, can display icons, etc.; the real GimpTool
only exists when needed.
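The shape of that split, combined with the proposed tool_manager_get_tool, might look like this (a rough sketch; the field layout, the fixed device array, and the int device index are placeholders, not the real GimpToolInfo):

```c
#include <stdlib.h>

/* Sketch of the GimpToolInfo idea (names and layout are guesses):
 * a lightweight info record stays registered so the UI can list
 * tools and show icons, while the heavyweight tool instance is
 * created lazily, one per (tool, input device) pair. */

typedef struct {
    int active;   /* stand-in for real tool state */
} ToolInstance;

typedef struct {
    const char *name;            /* what the toolbox displays      */
    ToolInstance *per_device[4]; /* lazily created, per device     */
} ToolInfo;

static ToolInstance *tool_manager_get_tool(ToolInfo *info, int device)
{
    if (!info->per_device[device]) {
        info->per_device[device] = calloc(1, sizeof(ToolInstance));
        info->per_device[device]->active = 1;
    }
    return info->per_device[device];
}
```

Each device gets its own instance, so tools can hold different state per input device without any destroy-and-recreate dance.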
> Problem: Some tools, such as iscissors, keep around a lot of cached data
> generated from the image they are attached to. Changing the image they
> are working on clears this cache. This can be slow when working on
> multiple images or layers.
Is there really a noticeable slowdown?
> Proposed Solution: a generic object, ToolCache, from which the specific
> kind of cache would be derived. A virtual function, compute_cache, would
> compute the value to be cached.
> For efficiency reasons, the cache may either be generated on-the-fly when
> its values are requested, or whenever the target changes. However, if the
> cache is not accessed after a certain number of changes it automagically
> switches to on-the-fly mode to conserve CPU cycles.
> dynamic_timeout (number of times the target can change until it switches
> to on-the-fly mode)
> dynamic_countdown (how many times remain)
> tool_cache_get (gets the data that is cached, generating it on-the-fly if
> needed)
> <the obligatory get and set methods>
> target_changed (the object whose data is being cached has changed)
> cache_flushed (sent whenever the cache has been flushed and filled with
> new values)
> compute_cache (mandatory virtual function to set the new values in the
> cache)
> target_destroyed (the target may be destroyed, but I'm not quite dead yet)
> Problem: The number of ToolCaches should be kept down to a reasonable
> number.
> Proposed Solution: have a maximum number a tool can have. Make it
> configurable, perhaps on a per-tool basis.
This sounds overly complicated and I'm not convinced that there's a reason
to introduce such a complex system. I might be wrong and simply not seeing
your point here.
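For concreteness, here is roughly what the quoted ToolCache proposal would amount to (a sketch only; the int-valued cache, the function-pointer "virtual", and all field names are placeholders, not a proposed implementation):

```c
#include <stdlib.h>

/* Sketch of the quoted ToolCache proposal. A compute_cache
 * "virtual function" fills the cache; by default the cache is
 * refilled eagerly on every target change, but if it goes unread
 * for dynamic_timeout changes in a row it switches to recomputing
 * lazily ("on-the-fly") on access. */

typedef struct ToolCache ToolCache;

struct ToolCache {
    int (*compute_cache)(int target);  /* mandatory virtual function   */
    int value;                         /* the cached data              */
    int valid;                         /* is `value` current?          */
    int target;                        /* object whose data we cache   */
    int dynamic_timeout;     /* changes tolerated before going lazy    */
    int dynamic_countdown;   /* changes remaining until the switch     */
    int on_the_fly;          /* 1 = recompute only on access           */
};

static void tool_cache_target_changed(ToolCache *cache, int new_target)
{
    cache->target = new_target;
    cache->valid = 0;
    if (!cache->on_the_fly) {
        if (--cache->dynamic_countdown <= 0) {
            cache->on_the_fly = 1;  /* unused cache: stop eager refills */
            return;
        }
        /* eager mode: refill immediately */
        cache->value = cache->compute_cache(cache->target);
        cache->valid = 1;
    }
}

static int tool_cache_get(ToolCache *cache)
{
    if (!cache->valid) {
        cache->value = cache->compute_cache(cache->target);
        cache->valid = 1;
    }
    cache->dynamic_countdown = cache->dynamic_timeout; /* cache was used */
    return cache->value;
}
```

Written out, it is two functions and a handful of fields per cache; whether that machinery pays for itself is exactly the open question above.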
> P. S. The new implementations of DrawCore (now called DrawingTool) and
> PaintCore (now called PaintingTool) will be committed as soon as I finish
> the game of freeciv I'm playing.
All Gimp objects start with the word Gimp, so it should be GimpDrawingTool
and GimpPaintingTool (btw, why the "ing" ??). If I remember correctly, the
DrawCore is that ugly thing we use to draw on the display, so it should
probably totally go away and be implemented on top of the yet to be written
GimpPaintingTool is meant as the base class all paint tools derive from?!
I think it would help if you could send more info about your plans for
the Gimp's tool system.
Gimp-developer mailing list