After a bit of testing with online photo-editing apps (Photoshop.com, Aviary, 
Picnik), I think my strategy has changed:

(1) Make the change using a ShaderJob.

(2) Modify the hue for all pixels using a ShaderJob.

(3) Make the change using a ShaderJob.

(1st undo) Replace the current bitmap with the original bitmap, then perform 
steps 1 and 2.

(2nd undo) Replace the current bitmap with the original bitmap, then perform 
step 1.

(3rd undo) Replace the current bitmap with the original bitmap.
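
The replay-based undo above can be sketched roughly like this (TypeScript for 
brevity; `Bitmap` and `Edit` are stand-ins for BitmapData and a ShaderJob run, 
and the class name is made up):

```typescript
// Replay-based undo: keep only the original image plus the list of edits.
// Undo drops the last edit and re-renders everything from the original.
type Bitmap = number[];                    // stand-in for BitmapData pixels
type Edit = (img: Bitmap) => Bitmap;       // stand-in for running a ShaderJob

class EditHistory {
  private edits: Edit[] = [];

  constructor(private original: Bitmap) {}

  apply(edit: Edit): Bitmap {
    this.edits.push(edit);
    return this.render();
  }

  undo(): Bitmap {
    this.edits.pop();                      // drop the most recent edit...
    return this.render();                  // ...and replay the rest
  }

  // Re-apply every remaining edit to a fresh copy of the original bitmap.
  private render(): Bitmap {
    return this.edits.reduce((img, e) => e(img), [...this.original]);
  }
}
```

Memory stays flat (one original plus small edit descriptions), but every undo 
costs a full replay of the remaining edits.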

This uses less memory but makes undos more computationally expensive.  Some of 
the apps I tested appear to store full bitmap data objects for the last one or 
two steps performed, allowing for quick execution the first couple of times 
the user hits undo.  If the user keeps hitting undo beyond that, the app falls 
back to retrieving the original image and re-applying every change from the 
original up to the specified undo step.  This can get really expensive with a 
lot of steps (it actually causes some of the apps to crash), but I suppose it 
beats storing a bitmap data object for every step.
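
That hybrid (cheap snapshots for the last couple of steps, replay beyond that) 
might look something like this sketch; again, `Bitmap`/`Edit` stand in for 
BitmapData and ShaderJob runs, and the class and parameter names are invented:

```typescript
// Hybrid undo: full snapshots for only the last N steps, so the first few
// undos are instant; older undos replay all remaining edits from the original.
type Bitmap = number[];
type Edit = (img: Bitmap) => Bitmap;

class HybridHistory {
  private edits: Edit[] = [];
  private snapshots: Bitmap[] = [];        // states *before* the last N edits
  private current: Bitmap;

  constructor(private original: Bitmap, private maxSnapshots = 2) {
    this.current = [...original];
  }

  apply(edit: Edit): Bitmap {
    this.snapshots.push([...this.current]);          // snapshot pre-edit state
    if (this.snapshots.length > this.maxSnapshots) {
      this.snapshots.shift();                        // cap snapshot memory
    }
    this.edits.push(edit);
    this.current = edit(this.current);
    return this.current;
  }

  undo(): Bitmap {
    this.edits.pop();
    const snap = this.snapshots.pop();
    // Cheap path: restore a cached snapshot.  Slow path: replay from original.
    this.current = snap ?? this.edits.reduce((img, e) => e(img), [...this.original]);
    return this.current;
  }
}
```

Tuning `maxSnapshots` trades memory for how many undos stay instant before the 
replay cost kicks in.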

These are my observations anyway.  I'd love to hear from someone who has more 
experience with this, though.

Aaron


--- In flexcoders@yahoogroups.com, "aaronius9er9er" <aaronius...@...> wrote:
>
> Hey coders,
>  
> I'm trying to implement a somewhat simplified version of Photoshop.com or 
> Aviary Phoenix for a client.  We're providing options to the user to perform 
> photo-wide changes like hue/brightness/contrast and scoped changes like 
> fixing blemishes, red eye, etc.  Right now we have things working okay.  
> We're applying photo-wide changes using a ShaderFilter on the main sprite and 
> the scoped changes are children sprites with bitmap fills created from the 
> result of ShaderJobs.  To redo/undo we just remove/add the associated 
> ShaderFilter for photo-wide changes or remove/add the associated sprites for 
> scoped changes.
> 
> This works nicely but we've run into the following problems:
>  
> (1) When the user zooms in on the image so that the image is really large, we 
> get this warning: "Warning: Filter will not render.  The DisplayObject's 
> filtered dimensions (4820, 3615) are too large to be drawn." and the filters 
> disappear.  I understand filters don't work once the bitmap gets beyond 
> 16,777,215 pixels (even though the original bitmap data is smaller than 
> that).  This is too limiting for our needs.
>  
> (2) If the user sets a hue and then makes scoped changes like fixing red eye, 
> it takes a few seconds for the changes to occur.  Without the hue set 
> beforehand, it's very fast.  While the red-eye fix is always processed 
> quickly (~2ms), it appears that the time delay occurs when the hue filter has 
> to re-execute.  I'm assuming it's re-executing anyway...that's my 
> understanding of ShaderFilters.
>  
> So, I went looking at Photoshop.com and Aviary and both seem to let you zoom 
> into an image really far (seemingly larger than 16,777,215 pixels), set a 
> hue, and see the results.  I would assume they're modifying the actual pixels 
> instead of using a ShaderFilter?
> 
> If this is the case, then how are they managing undo/redo?  Here are my 
> thoughts, but I'd appreciate some confirmation or correction from someone 
> who's more experienced than I am in this area.
> 
> Let's say the user (1) uses the red eye tool (scoped change), (2) changes the 
> hue (photo-wide change), then (3) uses the blemish tool.  Then the user hits 
> undo, undo, undo.  Here's how I was thinking about performing these actions:
> 
> (1) Store the bitmap data of the area that will be affected for undo.  Make 
> the change using a ShaderJob.
> 
> (2) Modify the hue for all pixels using a ShaderJob.  No bitmap data is 
> stored for undo, only the previous hue value.
> 
> (3) Store the bitmap data of the area that will be affected.  Make the change 
> using a ShaderJob.
> 
> (1st undo) Replace the affected bitmap data with the stored bitmap data.
> 
> (2nd undo) Execute a ShaderJob with the previous hue value.
> 
> (3rd undo) Replace the affected bitmap data with the stored bitmap data.
> 
> What makes me queasy about this is that we could potentially be storing 
> quite a bit of bitmap data for undo.  In some cases I think we might be able 
> to run the reverse of a ShaderJob for undo instead of storing the previous 
> bitmap data, but I don't think that's possible in all cases.
> 
> Am I way off here or am I on the right track?  Thanks.
> 
> Aaron
>
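
For comparison, the per-step undo records described in the quoted message 
(scoped edits snapshot only the affected region; the hue edit stores just the 
previous hue value) could be sketched like this.  The hue shader is modeled 
here as an invertible additive shift so "undo" can re-run it with the previous 
value; a real hue filter may not be invertible this simply, and all names are 
made up:

```typescript
// Per-step undo records: region snapshots for scoped edits, a stored previous
// hue for the photo-wide hue edit.
type Pixels = number[];

interface RegionUndo { kind: "region"; start: number; saved: Pixels; }
interface HueUndo { kind: "hue"; previousHue: number; }
type UndoRecord = RegionUndo | HueUndo;

// Stand-in for a hue ShaderJob, modeled as a wrap-around shift per pixel.
function applyHue(img: Pixels, hueShift: number): Pixels {
  return img.map(p => ((p + hueShift) % 256 + 256) % 256);
}

function undoStep(img: Pixels, rec: UndoRecord, currentHue: number): Pixels {
  if (rec.kind === "region") {
    const out = [...img];
    // Restore only the saved pixels of the affected region.
    rec.saved.forEach((p, i) => { out[rec.start + i] = p; });
    return out;
  }
  // Re-run the hue shader so the image ends up at the previous hue value.
  return applyHue(img, rec.previousHue - currentHue);
}
```

The region records cost memory proportional to the edited area, which is the 
storage concern raised above; the hue record costs almost nothing but depends 
on the edit being re-runnable or reversible.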

